CapeSoft Replicate
If your LogManager is not replicating, check out the Replicate Error Flow Diagram.


Frequently Asked Questions

Note: Some of these questions are answered comprehensively in the docs, so I've just linked the question directly there.

Upgrading from a previous version?

Questions on programming my application

Dictionary/File Setup Related Questions

LogManager Programming Questions

LogManager Setup and Running Questions

Logging and LogFile Questions

General Questions


1.1. How do I cater for auto-numbering in my database if I have a primary key with a unique auto-number field?

Answer: The concept of auto-numbering is probably the biggest difference between a "single" database type application and a replicated one. The approach Replicate takes is to allow 2 invoices to have the same number. Obviously they then need an additional field to make them unique - and we introduce this via the Site field. Each database location is called a Site, and all auto-numbered keys then consist of 2 fields: the site field and the auto-incrementing number. Thus invoice 120 becomes CPTN120 at one site, JHBG120 at another, and so on. The last field (in the key) must be the auto-incrementing field.
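
As a rough illustration (the file, field names and sizes here are hypothetical - follow the dictionary changes laid out in the docs), an Invoice file set up this way might look something like:

Invoice              FILE,DRIVER('TOPSPEED'),PRE(INV),CREATE   ! hypothetical example file
GuidKey                KEY(INV:GUID),NOCASE                    ! Replicate's unique GUID key
InvoiceKey             KEY(INV:Site,INV:InvoiceNo),NOCASE      ! Site first, the auto-incrementing field last
Record                   RECORD
GUID                       STRING(16)                          ! size as required by your Replicate dictionary setup
Site                       STRING(4)                           ! e.g. 'CPTN' or 'JHBG'
InvoiceNo                  LONG                                ! the auto-numbered field
                         END
                     END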

1.4. I have existing data which I am adding Replicate to. How do I populate the GUID field, since when I convert the files it is blank?

Answer: The best solution is to use FM3 (or FM2). This will auto-upgrade the files for you and populate the GUID as well. If this is not an option, you will need to write a routine for each file as follows:

ThisRep.SetGlobalSetting('NoLogging','+')   ! suspend logging while the GUIDs are primed
set(MyFile)
loop until access:MyFile.next()
  MYF:GUID = ThisRep.GetGUID()
  access:MyFile.update()
end
ThisRep.SetGlobalSetting('NoLogging','-')   ! restore logging


1.5. Which object should I use in my application?

Answer: There are 3 different objects that are part of the Replicate classes: the csLog, csLogManager and csLogConnectionManager classes. The csLog class is the most basic, containing the ability to log changes to files (adds/edits/deletes) and setup of the local site information. The csLogManager builds on the csLog class and handles the importing and exporting of the log files. The csLogConnectionManager builds on the csLogManager class to implement the transportation of the log files (i.e. the sending and receiving of the log files).

Each Site must only have one application with the csLogManager (or csLogConnectionManager) base class running at a time (the LogManager program). In the majority of instances, this means that you will require 2 separate applications - the LogManager program, and your application (if you want more than one instance of your program running at any site). In this case, your application must have the csLog as the base class. The only instance where you would use the csLogManager (or csLogConnectionManager) class in your application is if there will only be one instance of your program running at each site. The LogManager program should be running in the background all the time, so that it can handle the importing and exporting when needed.

1.6. I have a loop with a manual add that is coming up with a duplicate error. What am I doing wrong?

Answer: You need to clear the GUID before performing the add.

clear(PRE:Guid)

You may actually prefer to clear the entire record which has the same effect:

clear(PRE:Record)

Note: If you are doing a recursive add on a form (using ABC) you will need to do the following to clear the GUID:
  1. Go to your Global Embeds | Global Objects | Abc Objects | File Managers | FileManager for <YourFile> | PrimeAutoInc | CODE (Before Parent Call)
  2. Insert the following code:

    clear(PRE:Guid)

1.7. How do I derive my own class - I want to use my own method of...?

Answer: The simplest is to derive it locally (i.e. using the embed points), but if you want to override a method for all your applications, then instead of locally deriving the method in all your applications (and your LogManager), you can derive the class and use your own. Here is a simple example of writing your own methods to Get, Update and Add settings (for example if you want to store them in a TPS file).
  1. You need a .clw and a .inc file (like Replicate's) - let's call it MyRep.clw and MyRep.inc. For example:

    MyRep.inc:

    OMIT('_EndOfInclude_',_MyReplicate_)
    _MyReplicate_ EQUATE(1)
    include('replicate.inc')
    !--------------------------------------------------------------------------------
    !Class MyLogConnectionManager
    !--------------------------------------------------------------------------------
    MyLogConnectionManager Class(csLogConnectionManager),Type,Module('MyRep.Clw')
    GetSettingWindow PROCEDURE (<string Setting>),string,name('MyLogConnectionManager.GetSettingWindow') ,VIRTUAL
    AddSetting PROCEDURE (string Setting,<long option>),string,name('MyLogConnectionManager.AddSetting') ,VIRTUAL
    UpdateSetting PROCEDURE (string setting,string value),name('MyLogConnectionManager.UpdateSetting') ,VIRTUAL
    GetSetting PROCEDURE (string setting),string,name('MyLogConnectionManager.GetSetting') ,VIRTUAL
    END ! Class Definition
    !--------------------------------------------------------------------------------
    _EndOfInclude_

    MyRep.clw:

    Member()
    Include('MyRep.inc')
    !-----------------------------------------------------------------------------------
    MyLogConnectionManager.GetSetting PROCEDURE (string setting) ! Declare Procedure
    Ans String(255)
    CODE
    !Put your code here
    Ans = parent.GetSetting(setting)
    return(Ans)
    !------------------------------------------------------------------------------
    !-----------------------------------------------------------------------------------
    MyLogConnectionManager.UpdateSetting PROCEDURE (string setting,string value) ! Declare Procedure
    CODE
    !Put your code here
    parent.UpdateSetting(setting,value)
    !------------------------------------------------------------------------------
    !-----------------------------------------------------------------------------------
    MyLogConnectionManager.AddSetting PROCEDURE (string Setting,<long option>) ! Declare Procedure
    ReturnString cstring(255)
    CODE
    !Put your code here
    ReturnString = parent.AddSetting(setting,option)
    return(ReturnString)
    !------------------------------------------------------------------------------
    !----------------------------------------------------------------------------------
    MyLogConnectionManager.GetSettingWindow PROCEDURE (<string Setting>) ! Declare Procedure
    ReturnString CSTRING(256)
    CODE
    !Put your code here
    ReturnString = parent.GetSettingWindow(setting)
    return(ReturnString)
    !------------------------------------------------------------------------------


    A bonus extra is that you will still be able to use the object embeds to derive the class locally (in your application) if you want to.
  2. In the Replicate Global Extension template, check the Derive checkbox and enter your MyRep.inc in the Include file entry field, and the name of the class (MyLogConnectionManager) in the Other Class entry field.
  3. Enter the MyRep.clw file into your Project (External Source files).
    You will be able to use this class for all of your applications (LogManager as well).

1.8. I have a MsSQL database and a TPS database that I would like to Replicate between. How do I do it?

Answer: Replicate is driver independent, so the driver you use is not an issue. You will need a complete EXE set for each driver that you use. In other words you will need your application and the LogManager application compiled for TPS, and another application and LogManager compiled for MsSQL. Please note the SQL caveats in the docs. You must make sure that your dictionary uses external field names for all the fields in the files that are replicated.

For this kind of Project I recommend using:
  1. Multi-Proj. You can use MultiProj to compile the same application to different EXEs using different FileDrivers from the same dictionary. This means that you don't have to maintain an application and a dictionary for each file driver that you are using.
  2. FM3. Handles database upgrading automatically (for both TPS and MsSQL).

1.9. I would like to create an audit trail of changes that a user has made. How do I do this?

Answer: You simply need to set the user property as soon as it can be set. For example:

ThisRep.SetGlobalSetting('user',ds_CurrentName(AppNum))   !For Secwin users who want to use the Secwin login name
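
If you are not using Secwin, any string that identifies the operator will do. For example (GLO:UserName is a hypothetical global variable holding your own login name):

ThisRep.SetGlobalSetting('user',clip(GLO:UserName))   !GLO:UserName is a hypothetical global login variable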

1.10. I would like to turn logging off at some sites - how do I do this?

Answer: This is best done by disabling Replicate completely at a site (this is laid out in one of the Useful Tips). You can also turn logging off on the fly as well. This needs to be done with care because of the implications of not logging data. Basically all you need to do is to turn the NoLogging flag on:

ThisRep.SetGlobalSetting('NoLogging','+')
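
Remember to turn logging back on once the unlogged work is done:

ThisRep.SetGlobalSetting('NoLogging','-')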

1.12. I want to make the file structure changes, but don't want Replicate at this stage in my app. How do I populate the GUID fields?

Answer: Add the Replicate global extension template to your application as per normal (after making the file structure changes laid out in the What you need to change in your Dictionary section of the docs), and in the options tab, check the 'Make Replicate InActive (GUID generation only)' checkbox.


1.13. I'm having problems replicating my external files

Answer:
  1. After making a change to the field containing the name of the graphics file, check that the logfile contains the change correctly (see the An external graphics file entry in the logfile section of the docs for more information).
  2. If it does not, then either you have not set up the replication correctly, or else Replicate cannot find the file. Put a stop in the derived PrimeLog method:

    Check If Picture Exists screenshot
    If the stop does not appear, then Replicate is not detecting a field change. If the stop does appear, then check the Path and filename, and if it's not correct, then your path or filename has not been setup correctly on the Replicate global extension template.
  3. If the entry in the logfile does exist, then check the outgoing folder for the file (your application will place it there on field change). If it's not there, then you probably don't have access to the outgoing folder for that machine. Check access rights for the outgoing folder (and that you have not customized the name of the outgoing folder).

1.14. From a parent form I insert a child, but the Parent's GUID is not primed for the child insert.

Answer: If you don't have an autonumbered field, then the parent record is only inserted when you click the OK button on the parent form (which is when the GUID is generated). What you can do, in the PrimeFields method (or the PrepareProcedure routine if you're using legacy), is the following:

PRE:GUID = ThisRep.GetGUID()

Replicate will only prime the GUID on insert if the GUID is blank.

2.1. Is it possible to conditionally replicate records, based on a flag set in the record?

Answer: Yes. Please see the example in deriving your own methods. One thing you need to note: if you set the flag (upon insert), the record will not be propagated, and neither will any subsequent changes, even if the CUS:DontReplicate flag is later cleared (although there are ways around this - <g>).

You need to derive the PrimeLog method as well and place similar code there.
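
If deriving the class is more than you need, a cruder alternative is to wrap the write in the NoLogging setting described in FAQ 1.10. This is only a sketch - CUS:DontReplicate is assumed to be your own flag field:

if CUS:DontReplicate                            ! your own "don't replicate" flag
  ThisRep.SetGlobalSetting('NoLogging','+')     ! suspend logging for this write only
end
Access:Customer.Update()                        ! the change is saved but not logged
if CUS:DontReplicate
  ThisRep.SetGlobalSetting('NoLogging','-')     ! restore logging
end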

2.3. How do I know which tables I should add the Site field to and which I shouldn't?

Answer: There are basically 3 types of replication tables: the global table, the site-related table and the global-site table.

Questions to ask to determine which table type your table is (you'll need to do this for each table in your dictionary):
  1. Do I need to ADD records at more than one site?
  2. Does this table have unique keys (other than the GUIDkey)?
  3. Do I need to have every record in this file at all my sites?
  4. Do I use Subset Replication?

    If (1 is No or 2 is No) and 3 is Yes then
      this is a Global table
    Elsif 3 is No or 4 is No then
      this is a Site-related table
    else
      this is a Global-Site-related table

    1. The global table does not have a site field. It is designed to be replicated throughout the site tree, but can only be added to at one site (usually the primary site) - unless it has no unique keys (other than the GUIDKey).
    2. The Site-related table has a site field (which should be added to unique keys (other than the GUIDKey) in order to preserve their uniqueness), which will allow records to be added to that table at each site. The replication of those records is limited to the parent (and grand-parent) of that site (or a particular range of sites if desired).
    3. The Global-site table is a combination of 1 and 2. It allows replication of the record throughout the site tree, and allows each site to add records to the table (containing unique keys other than the GUID). Add the site field to this table, but call the site field something else (like Origin). Treat the Origin field (in your programming and dictionary) exactly like you would for the site field. Replicate will not recognize the Origin field as a site field, and so it will not impose site limitations on these records when they are replicated.

    If you omit a Site field from a table, then you must limit where this file can be added to, or else Auto-numbering keys (or any other unique keys) will clash - and 2 sites will contain two different records (albeit with the same unique key) - which cannot be replicated. In the above example, you should limit Product table inserts to the Head Office.

    If we decide that all the branches should have access to all the customers, then we must either remove the SiteField from this table (allowing insertions only at the primary site), disable Subset Replication or rename the SiteField to 'Origin'.

2.4. I cannot get my records (based on an ID field) to auto-increment. How do I do this?

Answer: You probably need to Initialize the Site field in your table.


2.6. I have one table definition and have multiple files which use this table definition (switch filenames at runtime).

Answer: You can't use Replicate in this scenario. You need to change your file structure in order to have one file per table declaration. You can have Superfiles (i.e. multiple tables in one file) though.


3.1. I want to license the number of sites Replicate will replicate to.

Answer: This is a little tricky, because it's difficult to know exactly how many sites you are replicating to (especially with subset replication). The best way (and probably the only way) is to:
  1. Limit the addition of sites to the Primary Site (disable the Site insert button if ThisRep.Site <> ThisRep.ParentSite and ThisRep.ParentSite <> '' - i.e. whenever this is not the primary site).
  2. Then to create a filter (for your license counter) based on the number of actual sites (not relationships).

The easiest way to do this would be: SIT:ThisSite = ThisRep.site and make the SIT:ThisSite field in the SiteFile insert form READONLY. The problem with this method is that you will only be able to add children to the primary site - you will not be able to add children to a site that is not the primary site.
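
For step 1, a minimal sketch (placed after opening the Browse Sites window; ?Insert is assumed to be the insert button's control equate):

if ThisRep.Site <> ThisRep.ParentSite and ThisRep.ParentSite <> ''
  disable(?Insert)            ! not the primary site, so no new sites may be added here
end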


3.3. How do I encrypt my logfiles?

Answer: Simply check the Use CapeSoft Encryption checkbox on the LogManager Setup tab and enter an Encryption key in the field provided below; your logfiles will then be encrypted. Encryption occurs after compression (if compression is used), which optimizes compression.


3.4. Is it possible to trigger replication manually every time I update something in the Parent Site?

Answer: Yes. You need a communication mechanism (like NetTalk) to inform the relating LogManagers that they need to ProcessLogFiles. This is outside the scope of Replicate, but is very possible with NetTalk or other such communication tools.


3.6. What is subset data replication, and how do I institute it?

Answer: To institute Subset Data Replication, you need:
  1. a SiteHi field in your SiteTable.
  2. to define the SiteField in your tables.
  3. to set a Site field identifier in your Global Extension template.

Note: Changes to files without the Site field will be propagated to all the children (as there is no site field to institute the limit on).

Note: If a Site's SiteHi field is clear, then a complete log file will be propagated to it. If SiteHi = Site, then a subset of data will be made containing only the changes that pertain to that site.

Conversely, if you do not want subset replication (i.e. you want to send the whole log file to all the children), you simply clear the Site field identifier in your Global Extension template - or omit the Site Range fields in the SiteFile tab of the Replicate global extension template (SiteHi and SiteLow).


3.7. How do I institute one-way Replication?


Answer: One-way Replication (on a file level) is often more an issue of authorization. Remember, we want all changes to be propagated throughout (with subset filtering) in order to ensure that all the sites are mirrored. What you may actually want is to prevent B100-B400 from editing the product information. A good way of doing this is to disable the update buttons on the product browse, and only enable them if this is the Primary Site. Example code to do this (after opening the window):

if (ThisRep.ParentSite = ThisRep.Site) or (ThisRep.ParentSite = '')
  do EnablePrimaryControls
end


If you are sure that one-way replication is what you're after, then One-way Replication by Site can be set up fairly easily. Add a DontSendFiles flag to your SiteFile (in your dictionary), and set the field up on the SiteFile tab in the Replicate Global Extension template (of your LogManager application). Then, at runtime, use the Site update form in the LogManager at the site from which you don't want to send files, and set the flag in the record where ThisSite = that site and RelatingSite = the site you don't want to send to.

If you don't want any of your changes to be included in the logfile, then you can set the NoLogging property in your ReplicateObject.Init method (after the parent call) in your LogManager application:
if self.site = self.parentsite or self.parentsite = ''
  self.SetGlobalSetting('NoLogging',1)
end



3.12 The SendEmail routine is never called. What am I doing wrong?

Answer:
  1. Check the Send Email Procedure is correct on the SiteSetup tab of the Replicate Global Extension template.
  2. Check the presence of the EmailAddress field  on the Site File tab of the Replicate Global Extension template.
  3. Check that each relating site has (in fact) got an email address in the EmailAddress field in the site file. If there is no email address for a site, then the logfiles will not be emailed.


3.14. I have FM2/3 added to my application, should I add it to the LogManager application as well?

Answer: Yes - you must add FM2 (or FM3) to your LogManager program as well.


3.15. How do I get replicate to do a dial up connection?

Answer:
  1. Create the DUN window.
  2. Call the Dial-up window from your ProcessLogFiles button (on your Browse Sites window) and remove the code template that is there.
  3. In the 'ConnectionsChanged' method, place the 'Process incoming and outgoing logfiles' code template (the one you removed from the Browse Sites).
  4. Place a close window command at the end of the source code generated by the above code template.


3.16. How do I stop the warning messages from appearing?

Answer: On the Replicate Global Extension template's Site tab, you'll find a checkbox "Suppress Warnings" - check this checkbox.


3.20. I would like to suppress ALIASes when doing a full export.

Answer: The best way of doing this is to skip the insert method before it writes to the file.

In the LogManager's Global Embed: Replicate | ThisRep | Insert | Before the Parent Call:

if lower(FileLabel) = 'aliaslabel' and self.FullExport   ! 'aliaslabel' is your alias's label, in lower case
  return                                                 ! skip writing ALIAS entries during a full export
end


You also need to create the FullExport property (in the Other Class Properties embed point), set it immediately before doing the full export, and clear it immediately afterwards.

For the CRCCheck you could do the same: create a property (say self.PerformingCRC), derive the FileCRC method, set the property as you go in, and clear it as you come out.

In your OpenFiles method,

  if self.PerformingCRC
    case clip(FileLabel)
    of 'AliasTable1'              ! your alias labels here
    orof 'AliasTable2'
      return (1)                  ! skip these tables during the CRC check
    end
  end

3.21. How do I make replication occur less often to one site than to the other sites?

Answer:
  1. Add 3 fields to your sitefile: TimeBetweenTransactions, LastTransactionTime and LastTransactionDate.
  2. There's an embed point where you can write code to skip the processing. In your global embeds:
    Replicate | ThisRep | AutoProcess | 3a) In loop - before processing

    if SIT:TimeBetweenTransactions > 0
      if (((today() - SIT:LastTransactionDate) * 8640000) + (clock() - SIT:LastTransactionTime)) < SIT:TimeBetweenTransactions
        exit      ! not enough time has elapsed since the last transaction - skip this site for now
      end
    end
    ! Remember to update SIT:LastTransactionDate and SIT:LastTransactionTime when the site is actually processed.


3.22. I want to customise which tables are replicated. How do I do this?

Answer: You can derive (and in fact override) the Registration method. The tables (that are not suppressed) will be registered in this method. You can manually code the file registration before the template-generated code and return before the template-generated code is called.

For Example:

self.Register(Company,'Company',COM:record,COM:Guid,RepHCOM:Record,COM:Site)
self.RegisterArray('COM:Phone',RepCOM:Phone,RepHCOM:Phone,Rep_TableIsThreaded,'Company')
self.Register(Contacts,'Contacts',CON:record,CON:Guid,RepHCON:Record,CON:Site)
self.RegisterArray('CON:Phone',RepCON:Phone,RepHCON:Phone,Rep_TableIsThreaded,'Contacts')
self.RegisterArray('CON:TestDate',RepCON:TestDate,RepHCON:TestDate,Rep_TableIsThreaded,'Contacts')
self.RegisterArray('CON:TestTime',RepCON:TestTime,RepHCON:TestTime,Rep_TableIsThreaded,'Contacts')
if self.ParentSite = self.site        !Only handle table changes if this is the parent site.
  self.Register(NotesFile,'NotesFile',NOS:record,NOS:Guid,RepHNOS:Record)
end
return


3.24. The LogManager processes constantly, generating a request to the server or client at least every second.

Answer: Check your TimeToProcess variable. This has been moved to a global variable: RepGLO:TimeBetweenTransactions. Set the initial value to the time you would prefer. There is an option on the Replication ControlWindow to adjust this, but you may have deleted this control on purpose (to prevent tampering by users).
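
For example, somewhere early in your LogManager's startup code (a sketch - the value 60 is purely illustrative; use whatever interval, in the units your ControlWindow expects, suits your sites):

RepGLO:TimeBetweenTransactions = 60   ! illustrative value only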


3.25. I selected "Direct Only" when I initially made my LogManager. Now I need to add FTP/Email but I can't find any of the template prompts anywhere?

Answer: I suggest you remake your LogManager including Email and FTP support. The direct only one is really for people who don't own (or don't want to include) NetTalk in their LogManager. This basically excludes all NetTalk code. However to get it all setup correctly, making a new LogManager from scratch is normally the way to go.

3.26 I get an Access Denied message on one of the logfiles during import. How do I get around this?

Answer: If you turn on silent (in the Global Extension Template), then this message will not be displayed - the logfile will simply be skipped, and the LogManager will continue processing logfiles from the other sites. The next time that the LogManager does a process, the logfile will be processed. If you still want to display the Replicate windows, but only want the decompression to be silent, then:

ThisZlib.silent = 1

is the property to set (the template will set it in the ThisRep.init method).
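
If you prefer to hand-code it, a minimal sketch for an embed after the ThisRep.Init parent call (assuming the compression object keeps the template's default name, ThisZlib):

ThisZlib.silent = 1      ! suppress only the decompression error window; other Replicate windows still show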


4.1. I'm trying to implement the Site range, but I am not getting all the records passed to my relating site.

Answer: You need to make sure that the Site range variables are filled in on the Replicate global extension template. If the Site range low is always the same as the Site, then you can omit the SIT:SiteLow variable. It is, however, a good idea to fill it in anyway, because at some stage you may want to have a mirror site (with the same data range as the parent site), and the site-related records < site will be omitted if this is not filled in.


4.2. How do I start the Replication process (i.e. what must I do to start the automatic export and import)?

Answer: Your own site record (in the SiteTable) will be automatically added when you first run either your own program or the LogManager program. The SIT:ThisSite field (in your SiteTable) will contain ThisRep.Site (your own site) for the records (in the SiteTable) that pertain to your site.

The best way to introduce a new site to the site tree is to create a child site from the parent. This is done in the LogManager, and there is a routine in the created LogManager that will do this for you. It will create a brand new child site and set up the connection between it and the parent. The only thing you will need to do is make sure that the INI file that is created is placed in the correct place for the new site's LogManager to pick up its connection details to the parent.

Alternatively, if you have 2 existing sites, that are independent from each other and need to start a relationship, then there are 2 ways to introduce a new site into the Site Tree:
  1. Copy its log files to the incoming directory of its parent. You can then edit the site record (that was automatically added to the parent's site) giving the parent the correct details of the new site. The parent will then begin exporting its logfiles to the new site's mailbox/incoming directory. The new site will then follow the same procedure when it sees a log file from a site that is not in its SiteTable.
  2. Alternatively, you can add a record to the parent site's (or the new site's) site table, ensuring that the mailbox/incoming directory details are correct, and the next time the site does an export, it will export its log files to the new site. The new site (upon receiving the log files) will add a new site record to its site table. You must ensure that the mailbox and/or incoming directory of the parent's site is correct.


4.4. I'm doing a complete import, but not all my (for example) customers are being imported.

Answer: Replicate is designed this way. You will see that the records that are imported into the Company table only pertain to that site (or that Site range). It throws away records that don't belong to that Site (or its children). If you add records to a table without a Site field limiter (such as the notes table), and import those, you will see that they are completely imported: because there's no SiteField in the table, Replicate knows that it must import all of those records.


4.5. I'm trying to run the examples, but I cannot find any files called abcmanb000.exe etc.

Answer: When you run the install program, you need to check the 'Install examples' check box (which is a checkbox on the Install Options window).


4.6. I can't get the replication working. What am I doing wrong?

Check out the Replicate Error Flow Diagram first.

Answer: Somewhere, the transporting of the files is breaking down. Transporting the logfiles is basically made up of the following steps:
  1. (optional - Indirect method) The logfiles are fetched (from the mailbox/FTP directory or whatever) and placed in the Incoming directory.
  2. (optional - compression) The logfiles are decompressed.
  3. The logfiles are then moved to the LogPath.
  4. The Logfiles are then imported.
  5. The LogManager then copies the Logfiles out of the LoggingPath and into the Outgoing directory. This is to prevent your programs from adding to the Logfiles while the LogManager is exporting.
  6. (optional - subset rep.) The LogManager then creates a logfile for each related site (this is the subset replication part) and places it in the other site's InDir.
  7. (optional - compression) Each logfile is then compressed into the z file.
  8. (optional - subset rep.) The subset logfiles are then deleted.
  9. (optional - Indirect method) The logfiles (or zipped logfiles) will then be transported.

Things to check:

  1. This is where most people struggle to get Replication going.

    There's a common collection-distribution point between the sending site and the receiving Site. If you are using the Indirect Method, this means a mailbox; if you're using the Direct Method, you must ensure that the directory that the log files are being copied to is the incoming directory of the other site. It's a good idea to place a string on a window indicating the relevant site information (like the incoming and outgoing directories, site ID, parent ID and Site Hi range), which will make it easy to check these links (see the sketch after this list).

    Think of the SiteTable as a Site address book. Each site has its own address in it, as well as all its relating sites' addresses. The detail that a site needs is where to send the logfiles for each relating site - via email, FTP and/or Direct - and these details need to be there so that the site knows where to put the logfiles that it is sending to the relating site.

    If you want to send something to one of your clients, you send it to their address/fax machine/email - each client has a single address/fax machine/email. Similarly with sites: each site has its own incoming email address/FTP directory/Incoming directory. All logfiles that it needs to import/receive will be at its incoming email address/FTP directory/Incoming directory. It will send logfiles to however many relating sites it has - to those sites' email addresses/FTP directories/Incoming directories.
  2. You may be doing subset replication, so your record inserts into a parent site have a site field that is out of the site range to be exported to the child site.
  3. If you are doing subset replication, and you are compressing the files, then it may be that your OutGoing directory is the same as the InDir for the relating site. In this case, the Processing of the logfiles (Step 6 - creating the LogFile subsets) is aborted and so the logfiles are not compressed.
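
As a quick way to surface that site information at runtime (see point 1 above), a sketch using only the Site and ParentSite properties - append your incoming/outgoing paths from wherever you store them:

message('This site: '    & clip(ThisRep.Site) & |
        '|Parent site: ' & clip(ThisRep.ParentSite))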

Still struggling?

OK - the crucial thing is that you track the logfiles' movement. What I would do initially is turn off compression and encryption (on the Advanced tab of your Replicate Global Extension Template in your LM). Stick to running 2 LMs on the same machine so you don't have to fight with file access rights just yet.

Stick to Direct transport at this stage - we'll move on to FTP (if you want) at a later stage.

Run both LMs and click the pause button.

Step 1:

Make a change in the child site, click process on the child's LM - check the OutGoing directory, then the OutGoing\ParentSite directory - any files in either of these?

Yes? Then you've got the parent's incoming dir wrong at the child's site. Back to step 1.

Nope? Move to the parent's incoming directory - see the logfile there? Cool.

Step 2:

Open the logfile (that's in the parent's incoming directory) and make sure your change is in the logfile - in there?

No? OK - you've made a change that's not replicatable to your parent (either outside the parent's site range - or else part of a table that's not replicated). Go back to step 1 and make a change to a table that is not site limited and that you know is replicated.

Step 3:

Click Process at the parent LM.

Did it process the logfile? (Check in the incoming directory - is the logfile gone)

No? Then we've missed a logfile somewhere along the way. Click process on both LMs a couple of times to get them back in sync again.

Yes? Check your changes are brought across.

For email transport: Don't forget to set the Helo on your SendEmails routine of the LM.


4.7. I want to create a mirror-site of one of my sites. How do I do this?

Answer: If you want a child to have complete replication, then you need to set the SiteHi range field in the SiteTable to blank. You must also set the SiteHi property (on the child side) to blank.


4.9. I want to synchronize some sites via the local network (i.e. using the direct method) and some via an Email/FTP connection.

Answer: You must use the ConnectionManager. For the sites that are in direct contact (across a LAN), omit the SIT:EmailAddress from the Site file record and point the SIT:DirectInDir directly to the Incoming directory of the relating site. In this way it will act like a LogManager.


4.11. I get 0Byte logfiles (with a tmp suffix) in FTP directories. What is the problem?

Answer: It is probably an issue with your Firewall or DNS server. Check that your Firewall is letting through both commands and data (FTP has 2 channels), and that the data is unlimited, as there could be a packet size limiter on the data channel (which is the channel on which all the logfiles are uploaded).

Also experiment with the Active and Passive mode FTP setting.


4.13. I create a new site and a huge logfile is sent to the child, and the child logs all these new inserts and sends it back to the parent. Why?

Answer: Basically the new site is verifying, with the parent, the inserts that it has performed. This logfile forms a backup from which you can restore.

However, if you have your own backup system, then you won't require this large initial log file, in which case you can switch logging off at the new child site for this initial logfile. You do this in the Auto Create a Child Site control template settings in the AutoCreateSite procedure of your LogManager. Clear the Log full data import at child site checkbox.

If you have your own method of creating an initial dataset (i.e. copying directly, etc.), and don't require the complete log of the dataset on child creation, then in the same place, you can clear the Perform full data export for the new child site checkbox to disable the complete log file creation.


4.12. All my logfiles (from 1 to x) are being sent over to my new child site. How can this be rectified?

Answer: You need to replicate your site table in order to get the correct logfile number to the new child when it is created, from where it should begin importing. This is done in the SiteFile tab of your Replicate Global Extension Template in your LogManager.


4.14. I have a number of logfiles in the incoming directory of a site - even after processing at that site.

Answer: If you are left with a number of logfiles in your incoming directory, then this indicates that that site (call it B100) is missing a logfile (or a number of logfiles) from the site that is sending those logfiles (check the first 4 digits of the filenames for the site ID - let's say B110). B100 will send a request to B110 requesting those missing logfiles. B110 will then send the missing logfiles and B100 will then import all the files in the incoming directory. If the request from B100 to B110 is not getting to B110, then it will not know to send the missing logfiles. At B100, look in the log\outgoing\B110 folder and see if there are a number of files waiting to be sent. If so, then this is indicative that there is a problem in the transport of the logfiles from B100 to B110.

4.15. I have a number of logfiles in the outgoing folders of a site - even after processing at that site.

Answer: This is indicative of a problem in the transport of the logfiles from the current site (let's say B100) to the relating site (let's say B110). If you are receiving logfiles from the relating site in the incoming directory (check after processing whether there are any B110xxxx.z or .ze files there), then you probably have a setup issue at B100. Check the B100/B110 record. For more details, check the JumpStart tutorial to get 2 sites Replicating section of the docs.


4.16. I'm controlling the LogManager from my application, but it doesn't start up / shut down with my app.

Answer: The LogManager uses NetTalk (a TCP/IP port) to talk to the Control Center client (or your application). If your LogManager is not starting up or shutting down with your application, then it's normally one of the following:
  1. Your ports are mismatched (i.e. the client is using one port, while the server is using a different one). Set the ports to the same value.
  2. You're using a variable to set the port, and the variable is 0 or a string value. Use a long, and make sure that it's set to >2000 (see the sketch below).
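
For example, declared in your global data (RepPort is a hypothetical name):

RepPort              LONG(2095)   ! a LONG (not a string), above 2000, and the same value in client and server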

4.17. How does the LogManager handle missing logfiles?

Answer: How the LogManager reacts to a missing logfile depends on the options you've set up in your LogManager.

See The Global Extension Template for an explanation dependent on your setup.


4.18. Not all my updates are replicated. I'm using MySQL.

Answer: Check the 'MySQL: Verify connection is open before parsing logfile' checkbox on the Advanced tab of the Replicate global extension template on your LogManager. Sometimes the MySQL ODBC driver has a phantom connection drop.


4.19. I'm using FTP, but my files aren't being replicated.

Answer: Using NetDemo (the example application that ships with NetTalk), connect to your FTP server, and upload and download files. You may need to adjust the mode (passive/active) - or other things like a non-standard port - to get the FTP working. Once you've got it working using the NetDemo application, you can move those settings to your LogManager. Note: You should not log in to the base directory. Your indir should be a subfolder.


4.20. The Logmanager is not deleting a related site's logfiles after they have been imported

Answer: You are most likely running the LogManager with the /RepDebugAll or /RepDebugFile switch in the command line (or using the debug window). Remove this from the command line. You'll need to manually delete the related logfiles that already exist in the logfile directory, but future imported logfiles will be removed automatically after a successful import.


5.2. I'm having new files created and often my log files are not logging my file changes.

Answer: You could be using the Filedialog command, which could be causing your data path to change. If you use the Filedialog command, then you need to set the necessary attribute that saves and restores your path.


5.3. In one procedure, my changes appear in the logfile, but not in the database.

Answer: You may need to clear the GUID field. If the GUID field is populated on an Insert, then the GUIDKey will have a duplicate error and the record will not be inserted. Simply add the following code before your add command:

clear(<FilePrefix>:GUID)


5.4. My entries to a file are not being logged, and neither is the GUID field in that file being populated (i.e. it's blank).

Answer: This means either that the file has been suppressed (if this is a multi-DLL application, then you need to check your Data DLL and your current app for the suppression), or that Replicate itself has been disabled in either the Data DLL (for multi-DLL applications) or the main EXE. If you are using a legacy application, then you need to call the GetGUID method before your add command (for suppressed files):

PRE:GUID = GetGUID()
add(Filename)


5.6. I have a process whose file changes are not being logged. What must I do?

Answer: You are probably using Clarion 5.5, where there is a bug in the FileCallback for PUT(View) (a Clarion limitation). The best solution is to either upgrade to Clarion 6 or not use the Process template.

5.7. I'm getting really large, bloated logfiles - and they seem to be growing exponentially.

Answer: Replicate does have some overhead for the HTML-style tags, so it could be that you have only 40 bytes changing (across a handful of different tables), but each byte change requires a field header tag at the start and end. You could also have very long field names. Each record entry change requires a time (and sometimes date) stamp, as well as a site stamp, a GUID, the table identifier, and possibly the user and machine (if these are set). This can add up to quite a large amount of overhead in terms of logfile size.

The other thing that could be happening is that your users are often forcing a LogManager close in the middle of a logfile process. Check out the Aborting in the middle of the ProcessLogFiles routine in the Useful tips section of the docs as to why and how to work around this problem.


5.8. How does Replicate handle logout/commit/rollback?

Answer: During a logout, Replicate writes its file-change log entries to a temporary logfile. Upon commit, it writes the entire contents of the temporary logfile into the real logfile. Upon rollback it simply deletes the temporary logfile.
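
To illustrate with standard Clarion transaction framing (Customer and Orders are hypothetical files):

logout(1,Customer,Orders)                       ! open a transaction covering these files
if Access:Customer.Update() or Access:Orders.Update()
  rollback()                                    ! Replicate simply deletes its temporary logfile
else
  commit()                                      ! Replicate appends the temporary logfile to the real logfile
end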

5.9. How many logfiles per site does Replicate support?

Answer: Replicate supports up to 2920080 logfiles per site. The first 65535 are numbered in hex; from there on (G000 onwards) alphanumerics (i.e. base 36) and other valid filename characters (!,#,$,%,&,-,^,_,~) are used in the logfile name. So if you are generating a logfile every 10 minutes for 10 hours a day (i.e. 60 logfiles per day per site, about 21900 per year), it will take roughly 133 years to run out of logfiles.

5.10. How can I encrypt a file's field as it gets written to the logfile (and then decrypt it when it gets imported at the child site)?

Answer:
The method you will use is very similar to the conditional ignorefield method. The only difference is that you will encrypt the field in that derived method, instead of conditionally ignoring the field (which you may do as well - prior to encrypting when writing to the file, or after decrypting when reading from the file in the LogManager).

6.1. I am getting corrupt records when saving from a form. What am I doing wrong?

Answer: You are probably calling a routine (like a process) that accesses the file being updated from the form. The record and saved-record variables are being populated with other values. You need to save and reload the variables on either side of the call to the routine, as follows:

      LocHSAL:Record = RepHSAL:Record     ! save the history buffer
      LocSAL:Record = SAL:Record          ! save the record buffer
      ProcessSalesDetails()
      RepHSAL:Record = LocHSAL:Record     ! restore the history buffer
      SAL:Record = LocSAL:Record          ! restore the record buffer


Note that you need to save both the History of the record and the Record itself in order for Replicate to log the changes correctly.


6.2. How does Replicate handle concurrency checking/conflict resolution?

Answer: Scenario:

If Joe changes Bruce's mailing address to 1 main street and an hour later, Mary changes the address to 11 main street - and BOTH Joe and Mary are and should be authorized to make this change - what happens?

The same thing that happens on your multi-user LAN system today.

The only difference of course is that in a Replicate program Joe saw 2 when he changed it to 1, and Mary saw 2 when she changed it to 11. In a live case Mary sees 1 when she changes it to 11. Question is - since Mary and Joe both think they have the _right_ number, they're going to enter it _regardless_ of the existing value.

Thus, although the question seems to suddenly be important (in a replicated environment) the truth is that it behaves _exactly_ the same as your LAN program has always behaved. It assumes that people entering data are entering correct data.

And if you start with the assumption that users are entering _incorrect_ data then you're losing before you begin...

We actually looked at it in some depth at the start of the design process, and to be honest there is no good way to do conflict resolution, even when you detect that this situation occurred. Consider;

We'll make a log - give the log to a conflict-manager, and he must resolve it.

This is completely unworkable in the real world, and will be completely ignored. Imagine if you had 5 sites (which by their nature are geographically disparate.) Mark is the conflict manager (like he doesn't have enough to do already !!). So he gets a report saying "Joe says 1, but Mary says 11". So Mark phones Joe:

M : "hello Joe, Mark here"
J : "Mark - I don't know no stinkin Mark... "
M : "Mark du Stud, Head Office, Data Conflict Resolution Manager"
J : "Yeah, so what..."
M : "Well you entered Bruce's Mailing Address as 1 main street"
J : "Yeah so what..."
M : "Well is that the correct value? Mary says it's 11"
J : "Of course mine is right... why would I enter a wrong value... get lost you poncy git..."

So Mark phones Mary. Guess what the outcome there is?

So what does Mark do? Most likely he just picks one. Say he gets 10 of these a day. By Friday guess what he thinks? He thinks _your_ program is a load of trash. It's made his life much harder, not easier. Besides, what is having him do this actually achieving?

Not having a conflict manager, and referring the conflict back to the participants (Joe and Mary) is even more useless. All that does is make everyone think the program is a waste of time, and doesn't actually lead to any better data.

Plus it's such a limited case. What happens if Joe changes it on Monday, it gets replicated, and Mary changes it on Wednesday. You have _exactly_ the same situation, but the computer thinks this is all hunky dory. As it should. And indeed this is what we expect from our computers.

Now there may well be situations where the _from_ value matters, but most of the time the _from_ value is simply ignored. The person is capturing the data _because_ the from value is wrong. I don't look at the address and say "well 1 is almost 11, I won't enter 11". It doesn't matter if the from value is blank, 1 or 100. It ain't 11. If I think 11 is the right value, then 11 I will enter. As programmers we assume people are thinking about what they're doing, and that the value they type in will somehow be different based on the from value. Of course in 99% of cases this is a myth.

Hence to answer your original question - what conflict?


6.3. How does Replicate handle updates to the same record at 2 different sites between synchronisations?

Answer: Replicate handles field level changes, so if a user changes the phone number at one site and at another the user changes the address, then these field-level changes will both be replicated correctly. If both operators change the field to the same value then both changes will be ironed out. If both operators change the same field of the same record to different values then read the question on conflict resolution.


6.4. I am still not sure how the Hi & Low limit works.

Answer: Low Limit is taken as your SiteID (or SiteLow field if you have this as a field in your file) and the Hi limit is an optional limit that you can set. This means that (if you look at the SiteTree) - changes that are propagated up from site B300 to B000, will not be propagated down the B100 branch for Site Identified records (if the Hi Limit is set to B1ZZ).

How it works: When you're running the ProcessLogFiles routine, it first imports all the files that are in your incoming directory. It then loops through your Site Table (for all sites where SIT:Site = ThisRep.Site) and exports the log files to the relating sites' (SIT:FromSite) incoming directories. While it's doing this it sorts through the log file, making a log file subset containing only those records pertaining to the sites in that branch. It uses the range in the SiteTable (i.e. SIT:FromSite = rangelow, or SIT:SiteLow = rangelow and SIT:SiteHi = rangehi) to make the logfile subset for each site. If SIT:FromSite = ThisRep.ParentSite (our parent), then it just copies the files there.


6.6. What happens when the site value in a site related record changes?

Answer: If subset replication is in place (more details), then it could happen that the record does not exist at the new site. In that case the record must be treated like an insert; otherwise it will be treated like an update (more details).


6.7. I don't understand the Last LogFileReceived Counters. Could you explain them?

Answer: We have 2 sites: D000 and D100. D000 is receiving logfiles from D100. D000 keeps track of the number and size of the last logfile received from D100 (let's say in this case the file is logfile #2). When it receives the next logfile (still #2 - but bigger, as more changes have been made), it will import from where it left off, so as not to re-import the entire logfile every time. During the course of the day it may receive a larger and larger version of logfile #2 as more and more changes are made to the database (at D100). When a complete logfile #2 is received, the logfile is imported, the LastLogFileReceived Number is incremented (to match the number of the complete logfile - in this case #2) and the Size indicator is reset (to 0). It then receives the next incomplete logfile (#3) - which has no EndOfFile stamp on it - and imports that, setting the LastLogFileReceived Size to match that of the logfile imported, but leaving the LastLogFileReceived Number at #2. It will continue importing updated versions of logfile #3 until it receives a complete logfile #3, at which point it will increment the LastLogFileReceived Number to #3 and reset the LastLogFileReceived Size.


6.9. How come I have some ztmp or zetmp files lying around?

Answer: The FTP procedure uploads files to a tmp file. When it has finished uploading, it deletes the correctly named file (if one is there already) and renames the newly uploaded file to the correct name. If those tmp files are lying around, it means that something happened to the FTP server connection during upload, so the upload never completed and the file was never renamed.

This will not affect the synchronising of your sites, as the corrupted upload file will merely be re-requested by the child site in question. These should be deleted by the sending LogManager on the next logfile upload that it will perform.

6.10. I am adding Replicate to an app with existing non-related multiple databases. How do I merge the sites' data?

Answer: Your tables can normally be classified into 3 types in this scenario:
  1. Tables whose records need to remain unique for each site (i.e. global site-related or site-related tables. See FAQ 2.3 for details).
  2. Tables whose records match anyway (typically this would be a list of static info - like country codes or department names, etc).
  3. Tables whose records should have the same id - but don't, because the different sites have been adding to the table independently of each other.

Case 1 - the data can be left as is, since these files will be either global site-related files or site-related files, so the site id will be added to the table to ensure that the records remain unique throughout the site tree. The child tables of those tables must have a site id for that relationship added to the table (and key) to remain correctly linked.

Case 2 - So long as the field used to reference a record from other tables is the same across the sites (typically State Codes or the like) - then you can decide on a master file, and physically copy that file to the other sites - after replicate has started on that particular site.

Case 3 - This is the tricky one. Typically in this scenario you have customer or product tables, where you want the same record to be used at all the sites, but the sites already have their own existing customers or products.

The best way in this scenario is to add the site field to this table and sync the data (so you're now left with a set of data for each relating site at the parent). Now you need to create a process that matches records (probably based on the fields you want to match up to determine a unique record) - which you must run at the parent. I'd add a new record for each match that's found at a new id number (say 20000 and up) and then set the site id to say $$$$. Make sure that you have relational cascade on change in the dct - and then change both of the matching records to the new ID (which will cascade down). Keep the relation matched just on the ID (i.e. not the site field as well) - you'll need to add the site field after (in priority) the id field in that key. You can then delete the 2 (or more) old matching records so that only one new record exists. You need a manual matching queue so that you can match up records that still have a site id after the automatic process has been completed.

U.1. What do I do when upgrading from a Replicate version prior to Beta 15?

Answer: You need to do the following (unless you handcoded your ProcessLogFiles):

For Email based transport system:

  1. You will need to change the csLogEmailManager class to the csLogConnectionManager class.
  2. If you are doing subset Replication (of logfiles), then you will need to check the Make logfile subsets for regional distribution checkbox in your Global extension template. (if not you will get a compile error instructing you to do so)
  3. You need to delete your SendEmail and ReceiveEmail procedures, and run the ImportReplicateEmailFTPxxx template utility (in the Application menu | Template Utility).
  4. If you handcoded a logfile request, using the previous method, then you need to delete this, because this will be done automatically for you.
  5. If you are using the Email transport method and did not choose the easiest way in Step 6, then you will need to delete any instances of the If File request is received, send files code template (which you may have placed in your ReceiveEmailProcedure). This is only if you used the old method of requesting missing logfiles.

For FTP based transport system:

  1. You will need to change the csLogManager class to the csLogConnectionManager class.
  2. Staying in the Replicate Global Extension template, change to the LogManager Setup tab and check the Allow for FTP transport checkbox and enter SendFTP and ReceiveFTP in the respective drop combo fields below.
  3. Staying in the Replicate Global Extension template, change to the Site File tab and fill in the relative Site File field (that contains the FTP Directory of the relating site) in the FTPDirectory entry field.
  4. You will need to remove the code that you inserted in your %ProcessLogFilesTransport embed point
  5. You will need to remove the call to ReceiveFTP at the beginning of your ProcessLogFiles routine. This will be located in your BrowseSites procedure.
  6. If you are doing subset Replication (of logfiles), then you will need to check the Make logfile subsets for regional distribution checkbox in your Global extension template. (if not you will get a compile error instructing you to do so)
  7. If you are doing runtime transport method selecting - then you need to check the Test for the local directory checkbox in your Global extension template. (if not you will get a compile error instructing you to do so)
  8. You will need to delete your existing SendFTP and ReceiveFTP methods and run the ImportReplicateEmailFTPxxx template utility (in the Application menu | Template Utility).
  9. In the Global Embeds, remove the 4 declared variables (pertaining to FTP) in the Replicate | ThisRep | Other Class Properties embed.
  10. In the Global Embeds, remove the code in the Replicate | ThisRep | Configure embed.


U.2. What do I do when upgrading from a Replicate version prior to Beta 16?

Answer: You need to do the following:
  1. If you set the INIFile in the Replicate | ThisRep | Init method, you need to move this to the Replicate | Set INI File for Replicate's settings embed point.
  2. If you placed code in the Replicate | ThisRep | Configure embed point (for setting up your own method of transport) you need to move this to the respective Replicate | ThisRep | Setup embed point.
  3. If you are upgrading from a version prior to Beta 15, then you also need to complete the steps in the beta 15 upgrade FAQ as well.


U.3. What do I do when upgrading from a Replicate version prior to 1.30Beta?

Answer: This only applies to LogManagers using Email or FTP as their transport mechanism. Re-run the template utility to import the LogManager into your LogManager (you can do a partial import which only imports the Transport, Backup and SiteSetup windows).


U.4. What do I do when upgrading from a Replicate version prior to 1.39 beta?

Answer: This only applies to those using FTP. Re-run the template utility to import the LogManager into your LogManager (you can do a partial import which only imports the Transport, Backup and SiteSetup windows).


U.5. What do I do when upgrading from a Replicate version prior to 1.46 beta?

Answer: This only applies to those who are doing their own calls to the Init and Kill methods.

NB Change - If you are handcoding the call to the Init and Kill methods, then you need to set ThisRep.Active (for Clarion 5.5) or ThisRepGlobal.Active (for Clarion 6). A warning is compiled if you are doing this. To turn the warning off, go to the Class tab of your Replicate Global Extension template, and check the Don't warn about setting the Active Property checkbox. But you must set the Active property before your Init and Kill methods are called:

ThisRepGlobal.Active = true
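
As a minimal sketch of the ordering (assuming the Clarion 6 object names ThisRepGlobal and ThisRep as generated by the template - check the names and the full Init prototype in your own application):

ThisRepGlobal.Active = true    ! set Active first (use ThisRep.Active in Clarion 5.5)
ThisRep.Init()                 ! your hand-coded Init call
  ! ... your program runs ...
ThisRep.Kill()                 ! your hand-coded Kill call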


U.6. What do I do when upgrading from a Replicate version prior to 1.49 beta?

Answer1: Remove the _NoReplicate_ project define from your project defines in both your application(s) and your LogManager.

Answer2: Re-import the SiteSetupWizard window and the AutoCreateChild window.

Answer3: This only applies to those who are using FTP for transport. The FTP transport mechanism has been simplified. FTP use falls into one of the following 3 categories, where your LogManagers:
  1. use only one FTP server and only FTP transport.
  2. use only one FTP server, but other transport methods as well (you'll be affected the most in this case).
  3. use multiple FTP server set-ups (which is not a very common occurrence).
If you fit into category 1 or 2, then you can remove your FTP fields from the site file - these become obsolete - as the setup is now done in the Replicate Global Extension template. This means that you set up your FTP Server details once and then don't have to alter them again.

If you fit into category 2, then you need to add a SIT:NoFTP flag to your SiteFile. Set it to 1, in the record where ThisSite = Site and RelatingSite = Site, at those sites that don't use FTP (strictly speaking this is optional, since the NetTalk FTP errors should be suppressed in any event). A sketch of a one-off routine to do this follows below.
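
A minimal sketch of such a routine, assuming SIT: is your SiteFile prefix, the Site and RelatingSite field names match your dictionary, and 'CPTN' is a hypothetical site ID for a site that does not use FTP:

set(SiteFile)
loop until access:SiteFile.next()
  if SIT:Site = 'CPTN' and SIT:RelatingSite = SIT:Site   ! the record where ThisSite = Site and RelatingSite = Site
    SIT:NoFTP = 1                                        ! flag this site as not using FTP
    access:SiteFile.update()
  end
end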

If you fit into category 3, then you need to set default FTP settings in the Replicate Global Extension template. You can override these settings in the SiteFile if you need to (this was the old method of setting up FTP).


U.7. What do I do when upgrading from a Replicate version prior to 1.69 beta?

Answer: In version 1.69, the classes were restructured and some properties were moved to the GlobalSettings class. If you have handcoded references to these properties, then you will need to change the syntax to use the GlobalSettings class rather than the old properties in the threaded class.

So:

self.Active => self.GlobalSettings.Active

self.NoLogging => self.GlobalSettings.NoLogging

When setting these properties, you must use the SetGlobalSetting method, or else setting the property will not be thread safe. Example:

self.SetGlobalSetting('active',1)

When using this method of setting properties, you must have already called the ThisRep.Init method, otherwise the internal object pointer will not be set and your application will GPF.
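
A minimal sketch of the required ordering (assuming the object is named ThisRep, and using the NoLogging setting purely as an illustration):

ThisRep.Init()                              ! Init must be called first - it sets the internal object pointer
ThisRep.SetGlobalSetting('NoLogging','+')   ! only now is the thread-safe setter usable
  ! ... file updates that should not be logged ...
ThisRep.SetGlobalSetting('NoLogging','-')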

For a complete list of the threaded and unthreaded (Global) settings, check the Object documentation.

U.8. What do I do when upgrading from a Replicate version prior to 1.98 beta?

Answer: A number of the properties have moved into the Global unthreaded class. This is to cater for the lack of threading control in the FileCallback methods (which is where Replicate does all the work in logging your file changes). As a result, if you refer to any of these properties in your handcode, that code will generate compile errors (Syntax error: Field not found and Unknown identifier). The following 4 properties are affected: inside, inited, logoff and TempOpened. You need to change a read from (e.g.):

if self.logoff = 1

to

if self.GetGlobalSetting('logoff') = 1

and a write from (e.g.):

self.logoff = 1

to

self.SetGlobalSetting('logoff',1)

Tip: you can increment and decrement the above 4 properties as follows:

self.SetGlobalSetting('logoff','+')        !Increments the self.GlobalSettings.Logoff[thread()] property
self.SetGlobalSetting('logoff','-')        !Decrements the self.GlobalSettings.Logoff[thread()] property


U.9. What do I do when upgrading from a Replicate version prior to 1.99 beta?

Answer: With the need to be able to abort processing, it is really handy to have an external application to manage your LogManager.

Create a new LogManager based on the controlcenterclient LogManager (if using WinEvent, then you will need to disable the Add Icon to the System tray control template in the frame)

You can either:
  1. Create another controller.app. The controller.app needs to be an ABC application (no dictionary required) - and run the Create a LogManager Controller utility template.
  2. Create the necessary controls in your application to control the LogManager.
There are a couple of things you need to change:
  1. The name of the LogManager EXE that the controller will call (this is in the template prompts of the Local LogManager Controller controls)
  2. The Port (on the options tab) which needs to match the port on the LogManager ControlCenter client settings.
Check out Useful Tip 23 for more details.


U.10. What do I do when upgrading from a Replicate version prior to 2.03 beta?

Answer: The control templates that exist on the ReplicationControlWindow (or BrowseSites window) have been merged into one control template. The old templates are still included in the build, but will be unsupported in the future. It is recommended that you move over to the new template by creating a new LogManager.

Replicate error flow diagram - a LogManager troubleshooting guide

Replicate Error Flow Diagram


Here is a screen shot of the BrowseSites window in the example LogManager with explanations of what each setting means. You can use this as a reference for some of the terminology used in the flowchart above.

Site File Description

The highlighted record (with the grey background) is the record describing our Site. You can use these settings (for our site) directly from this record in the SiteFile, or use settings from the INI File.

Errors and Warnings associated with Replicate

Replicate's Runtime Error messages and warnings
Clarion runtime errors associated with Replicate and runtime GPFs
Compiler warning messages and errors associated with Replicate

Replicate's Runtime Error Messages and Warnings

First, let's look at what the warnings are, and then at how to use these messages.

Replicate run time error messages and warnings

The Replicate Warning messages are all titled 'Replicate Warning (x)', where 'x' is the index number of the warning. In the above case, the index number is 5.

There are 2 severities of message - warnings (11 - 59) and errors (60 - 99). You can easily turn off warnings by setting the SuppressWarnings property as follows:

self.SetGlobalSetting('SuppressWarnings',1)

Warning 11

Default Text (translatable):'You need to set a Site|in order for data replication to occur.|Would you like to set it now?'
Occurrence 1:When no site has been set yet.
Occurrence 2:The INI file containing the site information is not found.

Warning 12

Default Text (translatable):'You need to set a directory|for the INCOMING data files.|Would you like to set it now?'
Occurrence 1:When no incoming directory for the log files has been set yet.

Warning 13

Default Text (translatable):'Logging path not configured.|Would you like to select a path now?'
Occurrence 1:When no path to place the log files in has been set yet.
Occurrence 2:The INI file containing the site information is not found.

Warning 14

Default Text (translatable):'You need to set a Parent Site|in order for data replication to occur.|Would you like to set it now?'
Occurrence 1:When no parent site has been set yet.

Warning 15

Default Text (translatable):FullFileName &  '|This file was not found for import.|This normally means that a log file was missed.|Import will be aborted. The File attempted was:' & LogFileName.
Occurrence 1:Replicate is attempting to import a logfile which it is not ready for, i.e. the sequence of text files has been broken. For example: the file waiting to be imported is B1000004.log, but we have only imported up to B1000002.log. This means that file B1000003.log has been missed. The missing file will automatically be requested from the respective site.
Occurrence 2:Replicate is attempting to import a logfile, and another program (i.e. a text editor, or the like) has exclusive access to the log file. Quit the other program so that Replicate can use the file.

Warning 16

Default Text (translatable):'If you are doing conditional replication (i.e. only distributing subsets of data)|then you need to specify the highest site ID for the data in this subset.|Would you like to set it?'
Occurrence 1:When no high site ID (SiteHi) has been set yet and the 'Never' button was not pressed previously.

Warning 17

Default Text (translatable):'Error: The Site name in the file name|is different to the Site name in the header of the file:'
Occurrence 1:Occurs if the log file has been renamed. This is a basic check to ensure that the site name in the header matches the site name in the filename.

Warning 18

Default Text (translatable):'You need to set a directory|for the OUTGOING data files.|Would you like to set it now?'
Occurrence 1:When no outgoing directory has been set yet.
Occurrence 2:The INI file containing the site information is not found.

Warning 19

Default Text (translatable):'Some of the Email settings|have not been set-up yet.|Would you like to set them now?'
Occurrence 1:When some/all of the Email settings have not been set-up and you are using the csLogConnectionManager class.

Warning 20

Default Text (translatable):'A log file for a new site has been received. Would you like to auto add this site?'
Occurrence 1:When a log file arrives in the incoming directory, and the log file's site is not registered in our site file as a related site.

Warning 21

Default Text (translatable):'You have more than Incoming log file to be imported|but one prior to the last does not have an <EOF> stamp.|File: ' & ImportFileName
Occurrence 1:When a log file arrives with a higher number than the previously received log file, and the previous one was not received in its entirety (i.e. with an <End of File> tag at the end).
This normally occurs when a LogManager is set up incorrectly. It sends the first logfile to the wrong place. The destination is then corrected, and in the meantime the logfile number has incremented. The next logfile is therefore sent to the correct place, but the LogManager thinks that it has already sent the first one off correctly.
In the meantime, only the 2nd logfile is received at the relating site's LogManager. This is when the warning occurs, and a request for the missing logfile is immediately generated. This message should only occur until the next ProcessLogFiles is issued from the sending site. You can suppress this warning if you like, but it can be useful if there is a problem.

If you persist in getting this warning, it means that the request from the recipient site is not getting through to the sending site. Make sure that your site info (at the recipient site) is set up correctly so that it can send the request file to the sending site to request the missing logfile(s).
Occurrence 2:If a logfile with the <EOF> stamp is received and it is smaller than a previous file of the same name, then this will occur. This is normally the result of manual intervention with logfiles (i.e. manually editing logfiles or deleting logfiles from the logpath).
To solve this, you need to reset the incoming counters at the LogManager that is attempting to import the logfile. The record in which to reset the counters is the one where ThisSite = the SiteID of that LogManager and FromSite = the first 4 alphanumeric characters of the filename.

Warning 22

Default Text (translatable):'The Required path for replicating files was not found.|Would you like to create the following directory now?'
Occurrence 1:If you have set up a relating site's InDir (in the site file) but the directory does not exist, you will be prompted with this message.
Occurrence 2:If you are replicating logfiles, but the incoming directory or the outgoing directory is not located, then you can create it.

Warning 23

Default Text (translatable):'Some of the FTP settings are not setup yet.|Would you like to set these up now?'
Occurrence 1:Your FTP settings have not been completely set up. If you don't require FTP for this site, then click the 'Never' button.

Warning 24

Default Text (translatable):'If you are doing conditional replication (i.e. only distributing subsets of data)|then you can specify the lowest site ID for the data in this subset|(if it is not the same as this site''s Site ID).|Would you like to set it?'
Occurrence 1:You have set up your LogManager to allow for a SiteLow that is not the same as your Site ID. If it is the same, click the 'Never' button.

Warning 25

Default Text (translatable):'The Site value for the low range is higher than the SiteID itself.|You need to set one that is lower. Redo Now?'
Occurrence 1:Incorrect SiteLow setting.

Warning 26

Default Text (translatable):'The Site value for the high range is lower than the SiteID itself.|You need to set one that is higher. Redo Now?'
Occurrence 1:Incorrect SiteHi setting.

Warning 27

Default Text (translatable):'Logging Error : Site Name not configured'
Occurrence 1:The site name was not configured correctly, and an attempt was made to log a file change.

Warning 28

Default Text (translatable):'This Array is not Registered so import will not be updated (for this field)'
Occurrence 1:The array is not registered, so an array change cannot be logged. This could occur in a Multi-DLL setup where the 'Generate all files' checkbox in the Global Properties is not checked.

Warning 29

Default Text (translatable):'This File in your database is not registered:|'
Occurrence 1:The file is not registered, so a file change cannot be logged. This could occur in a Multi-DLL setup where the 'Generate all files' checkbox in the Global Properties is not checked.

Warning 30

Default Text (translatable):'This File was not available for exporting.|The exporting cannot continue until this file has been found:|'
Occurrence 1:The logfile specified is not available for exporting. This generally occurs when a logfile has been deleted or the hard drive is corrupt. We've implemented a couple of possible solutions for this occurrence. Check out the Global Extension template for the implementation of MissingLogFiles.

Warning 31

Default Text (translatable):'The Checksum for this file is invalid, which means that the file has got corrupted.| Do you want to abort importing this file for now and request the file again?'
Occurrence 1:The CRC of the logfile received is incorrect. This means that the logfile received is not the logfile sent. The logfile will not be imported - and a request for the logfile will be automatically generated.

Error 60

Default Text (translatable):'Error Creating Log File: ' <LogFileName> <error>
Occurrence 1 (critical):LogFile cannot be created. This normally means that you don't have create/write rights to the directory in which the file is being created.

Error 61

Default Text (translatable):'Error Opening Log File: '<LogFileName> <error>
Occurrence 1 (critical):LogFile cannot be opened. The included error will give details as to why this error has occurred.

Error 62

Default Text (translatable):'Error Closing Log File: ' <LogFileName> <error>
Occurrence 1 (critical):LogFile cannot be closed. The included error will give details as to why this error has occurred.

Error 33 or 63

Default Text (translatable):'Error Trying to insert a log into the logfile:' <LogFileName> <error>

'Unable to Lock:' <LogFileName>
Occurrence 1 (critical):An error occurred while trying to insert a FileChange log into the log file.
Solution:This is normally the result of too many applications trying to write to the log file simultaneously. You need to increase the number of times your application attempts to write to the logfile (a setting on the Advanced tab of the global extension template). This will slow data file writing, but will ensure that replication is accurate.

Error 64

Default Text (translatable):'Error registering a file:' <FileName> <error>
Occurrence 1 (critical):A table in the dictionary cannot be registered in the internal Replicate file queue. This means that file changes cannot be logged.

Error 65

Default Text (translatable):'Error updating (add/put/delete) a record in file:' <FileName> <error>
Occurrence 1 (critical):There is an error importing a file change into one of the tables. Replicate aborts the import on the following errorcodes: 5, 8, 37, 75, 90. If you want to override a critical error you can do this in the CriticalErrorInImport method. In this way you can determine, for your backend, which file errors are safe to continue importing on and which are not. We would rather halt the importing and get you to assess whether it's OK to continue (and make the necessary tweak to your LogManager), than continue importing blindly when the problem could well be highly critical.
Solution:1. The most common problem related to this is that you are running more than one LogManager on that particular site. You must only run one LogManager for each site.
2. Make sure that you've checked the 'Generate all file declarations' checkbox in your LogManager application.
3. Check that you're not excluding the <FileName> file from replication (either in the dct or in the Global Extension template).

Error 66

Default Text (translatable):'Error opening a database file:' <FileName> <error>
Occurrence 1 (critical):A table in the dictionary cannot be opened.

Error 67

Default Text (translatable):'Corrupt/incomplete log file:' <FileName> <error>
Occurrence 1 (critical):A logfile has been chopped off - importing this logfile (and subsequent ones) cannot be completed.

Error 68

Default Text (translatable):'Error opening request file:' <FileName> <error>
Occurrence 1 (critical):A Message/request file cannot be opened.

What to do with warnings?

All warnings are directed through the RepMessage method. This means that you can intercept a warning and redirect it, or use it to trigger a process of your own. The place to put your code to do this is in the RepMessage method, before the parent call (in your global embeds):




You can test for a specific warning here (like Warning 21) and do something when it occurs, as in the sketch below.
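
A minimal sketch of such an embed (the parameter name pIndex below is purely illustrative - check the RepMessage prototype in your generated code for the actual parameter that carries the warning index):

  ! Embed: Replicate | ThisRep | RepMessage, before the PARENT call
if pIndex = 21                    ! hypothetical parameter holding the warning index
  ! e.g. write the event to your own alert log, or kick off a process
  ! that chases up the missing logfile, instead of just showing the warning
end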

Clarion Runtime Errors associated with Replicate and runtime GPFs

  1. On some earlier versions of Clarion 5.5 the Attempt to Use the (<filename>) before it is opened error is sometimes prevalent.

    Cause: This (it appears) is related to the thread instability of Clarion5.5.

    Solution: Disable lazy open in the Global Properties by clearing the Defer opening files until accessed checkbox:



    If this still does not resolve the problem, then adding CapeSoft's MessageBox to your LogManager application may resolve it.
  2. If you get the following error:

    GUID Duplicate Error

    Cause: You probably have a GUIDKey that allows duplicates.

    Solution: To find out which GUIDKey it is, run your program with the parameter /RepDebugFile on the command line; a stop will occur before this assertion indicating the name of the file. Open your dictionary and change the GUIDKey to a unique key.
  3. If you get the following error:

    Site File Not Threaded

    Cause 1: The path to your data is only set up after the ThisRep.Init call has been made.

    Solution 1: Move your path setup to before the ThisRep.Init call in the default (program) module. Alternatively, you can check the I'll do my own calls to the Init and Kill methods checkbox and handcode the calls to these two methods.

    Cause 2: When converting up to Clarion 6, you will get the following error (in your LogManager) if your SiteFile is not THREADed

    Solution 2: You need to open your dictionary and edit the SiteFile's properties. On the General tab, you will find a checkbox Open in current thread. This must be checked.
  4. If you get an immediate GPF when running your LogManager.

    Cause: You could be missing the zLib.DLL.

    Solution: Include the zLib.dll in your install set (i.e. to reside with your logmanager).

    Please note that even if you have compiled in localmode, you will still need to include this DLL in your install set.
  5. If you get a WSDIAL GPF while running your LogManager or your log enabled application.

    Cause: It seems that this GPF sometimes appears when you have changed something in your log enabled application but did not recompile the LogManager.

    Solution: The workaround is to recompile the log enabled applications and the LogManager. Sometimes only the Replicate object is rebuilt and then it works fine.
  6. If you get the following assertion error when exiting your logmanager:

    Log Manager In Debug Mode

    then you have your Project in Debug - Full (this only happens in ABC Clarion5.5). Turn the debugging off to make this go away:

    1. In the Application tree, select Project
    2. Then click the Properties button
    3. Then in Debug Information group (this is on the Debug tab in Clarion 5.5) select 'Off' in the Mode drop down list.
    This is caused by overzealous file checking from the ABC library. This was resolved in Clarion6.
  7. I get an "Unable to log transaction (48) attempting to frame the transaction on <filename>" error. If I disable the Replicate extension everything works.

    Solution: You need to set each Replicated file to 'open in current thread'. This is a setting in the File properties in your dictionary.
  8. I get a 'File based range limit requires linking fields.' error when I run my LogManager.

    Solution: You need to create a relationship between the LogHistory and the SiteFile in your dictionary. The Bulk Dictionary Editor (BDE) does not do this for you. SiteFile -> LogHistory (one:Many), linked on the Site and RelatingSite fields.
  9. My incremental locators are not working.

    Solution: This is a problem in some older versions of Clarion 6 (up to at least 9056). You need to move the GUID field to the last field in the record structure.
  10. I'm getting duplicate records in my browses.

    Solution: This is a problem in some older versions of Clarion 6 (up to at least 9056). The GUID field used to be generated with non-printable characters by the Replicate class - so records inserted prior to 2.39 will contain non-printable characters in the GUID field. This causes a problem in the browse class for displaying and re-populating the queue used in the browse. This should be fixed in later versions of Clarion 6.
  11. PC runs out of Winsock ports

    You need to change to use passive mode FTP. This can be a problem with Active mode.
  12. IPDriver application - server does not register all the tables

    Here's a tip from a fellow user: I found that if my GUID or my SiteID was listed as the first fields in my table structure in my dictionary, those tables would not show prefixes in the IP Driver and would not show up on an IP Server restart. I would then get Error 27 from my workstations when accessing those files. I had to move both GUID and SiteID to the middle or bottom of my table structure in the DCT. That solved the problem. All tables displayed correctly and IP Driver worked fine. I'm obviously not enabling Replicate on my IP Driven workstations.

Compiler Error Messages

  1. (<program>.map 1,1) Link Error: Duplicate symbol: $RepGLOSite, file <program>.obj
  2. Missing SiteFile fields
  3. Syntax error: Illegal data type: THE_CSLOGEMAILMANAGER_CLASS_IS_OBSOLETE
  4. Syntax Error: Field not found: This_Embed_point_is_obsolete__Click_the_Edit_Errors_button_for_details
  5. Syntax Error: Field not found: Make_logfile_subsets_for_regional_distribution_is_obsolete__Click_Edit_Errors
  6. Syntax Error: Field not found: Test_for_the_local_directory_is_obsolete_here__Click_Edit_Errors
  7. Syntax Error: ThisRep.FM3Upgrading &= (ds_PassHandleForUpgrading())
  8. Warning: - Label duplicated, second used: _NoReplicate_
  9. Syntax error: Field not found: Active, User, Machine, NoLogging, etc.
  10. Syntax error:Unknown identifier PRD:GUID
  11. (<program>.map 1,1) Link Error: Unresolved External $REPGLO:SITE in <program>.obj
  12. Syntax error: Illegal reference assignment or equivalence
  1. (<program>.map 1,1) Link Error: Duplicate symbol: $RepGLOSite, file <program>.obj

    then you have a multi-DLL application in legacy templates. The Bulk Dictionary Editor adds a global data file to your dictionary and there is a bug in the legacy templates (prior to C55g) which can either be fixed by:

    1. Upgrading your Clarion5.5 to the G patch (which is highly advisable, as programs compiled with previous versions of Clarion5.5 are unstable with WinXP),

    - OR -

    2. Changing your Program.tpw template (which resides in your c55\template directory). You need to replace the following code:

    #MESSAGE('Generating Global Data',3)
    #FOR(%GlobalData)
    %GlobalData %GlobalDataStatement
    #ENDFOR
    #FOR(%CustomGlobalData),WHERE(%CustomGlobalDataBeforeFiles)
    %[20]CustomGlobalData %CustomGlobalDataDeclaration
    #FOR(%CustomGlobalDataComponent)
    %[20 + (%CustomGlobalDataComponentIndent * 2)]CustomGlobalDataComponent %CustomGlobalDataComponentDeclaration
    #ENDFOR
    #ENDFOR


    - WITH -

    #MESSAGE('Generating Global Data',3)
    #FOR(%GlobalData),WHERE(NOT %GlobalDataLast)
    #IF(%GlobalDataInDictionary AND %GlobalExternal AND %GlobalDataLevel = 1 AND %GlobalData <> '')
    %[20]GlobalData %GlobalDataStatement,EXTERNAL,DLL(dll_mode)
    #ELSE
    #IF(%GlobalDataLevel = 0)
    #SET(%ValueConstruct, 1)
    #ELSE
    #SET(%ValueConstruct, %GlobalDataLevel)
    #ENDIF
    %[18 + (%ValueConstruct * 2)]GlobalData %GlobalDataStatement
    #ENDIF
    #ENDFOR
    #FOR(%CustomGlobalData),WHERE(%CustomGlobalDataBeforeFiles)
    %[20]CustomGlobalData %CustomGlobalDataDeclaration
    #FOR(%CustomGlobalDataComponent)
    %[20 + (%CustomGlobalDataComponentIndent * 2)]CustomGlobalDataComponent %CustomGlobalDataComponentDeclaration
    #ENDFOR
    #ENDFOR
  2. Missing SiteFile fields

    You need to check the Generate all File Declarations checkbox on the File Control tab in the Global Properties window.
  3. Syntax error: Illegal data type: THE_CSLOGEMAILMANAGER_CLASS_IS_OBSOLETE

    This means that you have upgraded from a version prior to Beta 15. The csLogEmailManager class was superseded by the csLogConnectionManager class. You need to:

    1. Go to your Replicate Global Extension Template.
    2. On the Class tab, select 'csLogConnectionManager' from the list of available classes.
  4. Syntax Error: Field not found: This_Embed_point_is_obsolete__Click_the_Edit_Errors_button_for_details

    This means that you placed some code in the old ProcessLogFiles loop (like your own transport method). You need to:

    1. Cut the code from the %ProcessLogFilesTransport,'After processing logfiles - external transport mechanism' embed point.
    2. Go to your global embed points - Replicate | ThisRep | AutoProcessFiles | After processing logfiles - external transport mechanism and paste the code there.

    If you are using FTP for your transport, then it is not necessary to handcode it, as this is now built in to the csLogConnectionManager class.
  5. Syntax Error: Field not found: Make_logfile_subsets_for_regional_distribution_is_obsolete__Click_Edit_Errors

    This means that you have the 'Make logfile subsets for regional distribution' checkbox checked in this Code template. This option is obsolete - you need to:

    1. Check the same checkbox on the Replicate Global Extension template on the LogManager Setup tab.
    2. Clear the checkbox on this code template.
  6. Syntax Error: Field not found: Test_for_the_local_directory_is_obsolete_here__Click_Edit_Errors

    This means that you have the 'Test for the local directory' checkbox checked in this Code template. This option is obsolete - you need to:

    1. Check the same checkbox on the Replicate Global Extension template on the LogManager Setup tab.
    2. Clear the checkbox on this code template.
  7. Syntax Error: ThisRep.FM3Upgrading &= (ds_PassHandleForUpgrading())

    Replicate is now bypassed during FM3 upgrading (except for auto-numbering). Old versions of FM2/3 do not support this - you need to download the latest version of FM2/3.
  8. Warning: - Label duplicated, second used: _NoReplicate_

    In version 1.49 the project define was changed to being a global equate. You need to remove the _NoReplicate_ project define from your project defines.
  9. Syntax error: Field not found: Active, User, Machine, NoLogging, etc.

    For 'Field not found' on LogOff, Inside, Inited or TempOpened, check out the U.8. What do I do when upgrading from a Replicate version prior to 1.98 beta? section of this doc for details on what to change.

    For 'Field not found' on Active, User, Machine, NoLogging, TranslationFile, CanConstruct, FM3Upgrading, FM3SettingVar, LogFileNumber, LogFileDate, LogFileTime, LogFileName, Silent, SuppressWarnings or SuppressCritical, check out the U.7. What do I do when upgrading from a Replicate version prior to 1.69 beta? section of this doc for details on what to change.
  10. Syntax error: Unknown identifier PRD:GUID

    Basically what's happening here is that Replicate is trying to register a table that has no GUID or GUIDKey. Either you want this table to be Replicated, in which case you need to add the GUID field and GUIDKey (as laid out in the What you need to change in your Dictionary section), or you don't want this table to be Replicated, in which case you need to follow the steps for excluding a file from replication - UT - Suppressing files in the dictionary (as opposed to in each app).

    If you are still receiving these compile errors, then you are trying to suppress a table which should not be suppressed because of relational integrity. In other words, you have a table that is being replicated, and you are trying to suppress one of its related tables (1:many) that has a constraint on it (restrict update and/or restrict delete). This is an illegal suppression, so you need to either:

    1. Suppress the table with the 1:many relationship.
    2. Allow replication for the constrained table.
    3. Remove the constraint from the table relationship (i.e. make the changes/deletes cascading or do nothing rather than restrict).
  11. (<program>.map 1,1) Link Error: Unresolved External $REPGLO:SITE in <program>,obj

    RepGLO:Site is part of a global variable "table" in your dictionary (that Replicate requires). You've checked the 'Generate template global data as EXTERNAL' checkbox (on your application's Global Properties) - since RepGLO:Site is a global table variable, the EXTERNAL attribute is being added to the variable and the linker is looking for it elsewhere, even though this application is probably the data DLL or a single EXE application where the variable should be declared locally.
  12. Syntax error: Illegal reference assignment or equivalence

    SELF.File &= Parent or if pFile &= Parent   (and other lines)

    You have a parent file in your dictionary - and your application is a legacy application. Because Replicate is a set of classes, parent is a reserved word when using OOP. You will need to rename the parent file in your dictionary to an ABC/OOP compliant name.

Debugging Replicate

Replicate makes use of the API command OutputDebugString to pass debug information to a debug trapper - like Mark Russinovich's DebugView (www.sysinternals.com), which I recommend, although I cannot guarantee it or be held responsible for any losses, damages, etc. arising from the use of that product.

To turn debugging on, simply set the property:

ThisRep.SetGlobalSetting('DebugMode',1)

Then you need to use one of the following command line parameters in order to debug the section that you require to debug:

/RepDebugImport          !Debug anything pertaining to the importing of logfiles in the LogManager
/RepDebugExport          !Debug anything pertaining to the exporting of logfiles in the LogManager
/RepDebugLogging         !Debugs anything to do with logging (file changes) in both the LogManager and your application
/RepDebugSetup           !Debugs the Site setup (configuration and initialisation)
/RepDebugFileHandling    !Debugs the logfile moving (in and out of directories - compression)

Adding A Parameter To Shortcut

Note: If there are quotes around the exe name, then you need to tag the parameter after the quotes (with a space between the quotes and the command line parameter as well).
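
For example (assuming your LogManager executable is called LogManager.exe and is installed in C:\MyApp - both hypothetical), the shortcut target would read:

"C:\MyApp\LogManager.exe" /RepDebugImport

Note the space between the closing quote and the command line parameter.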

You can add the subscript (All) to the parameters to obtain more detail.

The information passed to the debugger will contain the following:

Replicate(<[SiteID]>): [Replicate_Method_Name]: <[Procedure_Name]>: <[Error & Errorcode]>: <[Variables & Values]>

Support

To save you time (with the delay of waiting for reply emails), we suggest checking the FAQs. These are regularly updated and contain answers for most questions.
Your questions, comments and suggestions are welcome. Check our web page (www.capesoft.com) for new versions. You can also contact us in one of the following ways.
CapeSoft Support
Email
Telephone +27 87 828 0123
Fax +27 21 715 2535
Post PO Box 511, Plumstead, 7801, Cape Town, South Africa