Ensuring the Performance

Code development is only one aspect of a project; the performance of the output is equally important. While writing code, we need to optimise it to ensure maximum speed, because time is a more valuable commodity than anything else.

We are aware that TDL offers various ways to build customised solutions that simplify complicated business requirements. However, if the performance of a solution is poor, it cannot be used effectively. Hence, performance has to be ensured while developing a solution. The developer should ensure the following:

  • The correct code is written at the right places to achieve maximum speed. Refer to Choosing the Right Approach
  • Modern TDL artefacts are used to achieve maximum speed.
  • The latest TDL capabilities are used for optimum performance.

Attribute ‘Client Only’ in Collection

This attribute is used to gather the collection data locally, without sending a data request to the Server. Whenever data needs to be gathered from external data sources such as XML, DLL, or ODBC collections, the ‘Client Only’ attribute should be set to ‘Yes’, since the request does not need to be sent to the Server. The attribute plays a vital role when data is accessed through Remote Access. When this attribute is set to ‘Yes’, the collection is evaluated for a value at the client end.

Note: The ‘Client Only’ attribute should not be set to ‘Yes’ when its associated objects are not available at the Server.
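
For instance, a collection sourced from a local XML file could be marked as ‘Client Only’. The following is a minimal sketch; the collection name and file path are illustrative, and it assumes the XML file is available on the client machine.

[Collection: Ext Sales Data]
    Data Source : File XML : "C:\Tally\SalesData.xml"   ;; illustrative local data file
    Client Only : Yes                                    ;; gather locally; no data request is sent to the Server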

Attribute ‘Parm Var’ in Collection

As we already know, the ‘Collection’ artefact evaluates its various attributes either during initialisation or at the time of gathering the collection, and it may require various inputs from the requestor context for this. For example, the ‘Child Of’ and ‘Filter’ attributes are evaluated when the collection is gathered. In a Remote environment, however, the requestor context is not available to the collection at the Server end. To overcome this, a new collection attribute, ‘Parm Var’, has been introduced. It is a context-free structure available within the collection, and the requestor’s Object context is available for the evaluation of its value. The attribute is evaluated only once, in the context of the caller/requestor, when the collection is initialised. The given expression is evaluated and stored as a variable, and the stored value can then be referred to within any of the collection attributes. The value is made available at the Server end by sending it along with the XML request.
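
A minimal sketch is given below. The collection, variable, and method names are illustrative, it assumes the requestor’s Data Object carries a LedgerName method, and the exact ‘Parm Var’ signature (variable name, data type, expression) should be verified against the TDL Reference.

[Collection: Party Sales Vouchers]
    Type     : Voucher
    Parm Var : pParty : String : $LedgerName     ;; evaluated once in the caller's/requestor's context
    Filter   : PartyFilter

[System: Formula]
    PartyFilter : $PartyLedgerName = ##pParty    ;; stored value reused inside the collection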

Attribute ‘Search Key’ in Collection

The attribute ‘Search Key’ is used to create an index dynamically: the TDL programmer defines the key, and the collection is indexed in memory using that key. Once the index is created, any object in the collection can be searched instantly, without a sequential scan as in a filter. The Search Key is case sensitive.

Note: This attribute has to be used in conjunction with the function $$CollectionFieldByKey, which maps the objects at run time to the Search Keys defined in the Collection.
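
A minimal sketch of an indexed collection is shown below; the collection name is illustrative, and retrieval by key is sketched under $$CollectionFieldByKey further down.

[Collection: IndexedStockItems]
    Type       : Stock Item
    Fetch      : Name, ClosingBalance
    Search Key : $Name              ;; case-sensitive key on which the collection is indexed in memory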

Function – $$ReportObject

The function $$ReportObject evaluates the given expression in the context of the Data Object associated with the Report Interface Object. One of the essential use cases of $$ReportObject is in-memory collection gathering. Whenever a collection is gathered, it is retained in memory with the Data Object of the current Interface (Requestor) Object. If the same collection is used in expressions repeatedly, it is beneficial from the performance point of view to attach it to the ‘Report’ Object and evaluate it in the context of the ‘Report’ Object any number of times. This eliminates the need to re-gather the collection every time in the context of other Data Objects.
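
As a rough illustration, an aggregate over a collection could be evaluated against the Report’s Data Object as sketched below. The field and collection names are illustrative, and $$CollAmtTotal is used here only as an example of an expression worth evaluating in the ‘Report’ Object context.

[Field: Total Sales Fld]
    Set As : $$ReportObject:$$CollAmtTotal:PartySalesColl:$Amount   ;; evaluated in the Report's Data Object context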

Function – $$CollectionFieldByKey

The function used to retrieve values from a collection, based on the Search Key specified, is $$CollectionFieldByKey. This capability is quite helpful in matrix reports, i.e., when two or more dimensions need to be represented as rows and columns. In such cases, defining the Search Key on a method combination and using $$CollectionFieldByKey for value retrieval improves performance.

Note: This function has to be used in conjunction with the attribute ‘Search Key’. It maps the objects at run time to the Search Keys defined in the Collection.
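
A minimal sketch, continuing the IndexedStockItems collection from the ‘Search Key’ section above. The field and formula names are illustrative, and the assumed parameter order (method, key value, collection name) should be verified against the TDL Reference.

[Field: Item Closing Bal Fld]
    Set As : $$CollectionFieldByKey:$ClosingBalance:@@ItemKey:IndexedStockItems

[System: Formula]
    ItemKey : #ItemNameFld          ;; key value taken from another field; illustrative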

Function – $$ExclEvaluate

When prefixed to an expression, this function evaluates it without establishing a link with the UI elements. There may be cases where the programmer does not want the system to establish a dependency between the caller and the object being accessed, so that subsequent modifications to the value do not trigger a refresh. In such cases, prefixing $$ExclEvaluate indicates this to the system.
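
A minimal sketch; the field name is illustrative.

[Field: Snapshot Bal Fld]
    Set As : $$ExclEvaluate:$ClosingBalance   ;; evaluated without registering a UI dependency for later refresh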

Writing Remote Compliant TDL Reports

TDL developers can optimize the performance of Remote compliant TDL by minimizing server request calls. The guidelines below help optimize Remote compliant TDL reports.

Fetching a single Object

When an entire Report requires multiple methods of a single Object, the Object can be pre-fetched with the required methods. In this approach, only one server call is made to fetch all the required methods. We can make use of the attribute ‘Pre-Fetch Object’ at the Report definition.
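
A minimal sketch is shown below. The report name and the ##SVLedgerName variable are illustrative, and the exact signature of ‘Pre-Fetch Object’ (object type, object identifier, method list) should be verified against the TDL Reference.

[Report: Ledger Profile]
    Pre-Fetch Object : Ledger : ##SVLedgerName : Name, Parent, OpeningBalance, ClosingBalance   ;; one server call for all methods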

Repeating Lines over a Collection

The following techniques are used to optimize the performance when a line is repeated over a collection in a report to be displayed on the client.

  • Fetching the collections: It is important to fetch all the required collections at the ‘Report’ Definition. We can use the attribute ‘Fetch Collection’. 
  • Fetching the Methods: It is mandatory to specify ‘Fetch’ in the Collection for all the methods which are used in the fields at the Report. If ‘Fetch’ is not used, then the data will not be displayed in the field.
  • Function inside the ‘Repeat’: When lines are repeated over a Collection and a function is used at the field level, each ‘Repeat’ triggers an additional server request for the function call. In this scenario, the entire function call logic can be moved to a ‘Compute’ of the repeated Collection. The latter approach makes only one server request, so performance improves drastically, as shown in the sketch after this list.
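
A minimal sketch combining the three guidelines above; all names are illustrative, and $$String is used only as an example of a function call moved out of the field.

[Report: Party Ledgers Report]
    Fetch Collection : PartyLedgersColl                ;; fetch the collection at the Report

[Collection: PartyLedgersColl]
    Type    : Ledger
    Fetch   : Name, Parent, ClosingBalance             ;; fetch every method used in the fields
    Compute : BalText : $$String:$ClosingBalance       ;; function call moved from the field into Compute

[Field: Ledger Bal Fld]
    Set As : $BalText                                  ;; uses the pre-computed method; no extra server call per repeated line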

Using the same Collection in more than one Report

When more than one Report requires different methods of the Objects of the same Collection, using a single collection with all the methods fetched in it reduces performance. This can be improved in the following ways:

  • Fetching the required methods locally at the Report: For example, Report 1 requires the Opening Balance of a Ledger, whereas Report 2 requires the Closing Balance. Instead of modifying the Collection to fetch both Opening Balance and Closing Balance, the Fetch is localised in the respective Reports, as sketched after this list.
  • Separate Collections for fetching different methods: Two different Collections can be created for fetching the opening balance and the closing balance, and used in Report 1 and Report 2 respectively.
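
A minimal sketch of the first approach, assuming a Collection can be localised within a Report using the ‘Local’ attribute; the collection and report names are illustrative.

[Collection: LedColl]
    Type : Ledger

[Report: Report 1]
    Local : Collection : LedColl : Fetch : OpeningBalance   ;; fetch only the method this report needs

[Report: Report 2]
    Local : Collection : LedColl : Fetch : ClosingBalance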

Action ‘

The need for supporting the ZIP format in Tally has been felt in various offline Integration projects. Data Exchange takes place between branches and their Head Offices, Distributors and the Principal Companies, etc. The Head Offices/ Principal Companies having Tally or any other ERP would require the data from Branches/ Distributors for business performance visibility. Usually, Principal Companies require the Item-wise Sales Information of the distributors, which helps them plan their Stocks. Head Offices/ Principal Companies generally get Tally installed at Branches/ Distributors’ locations for integration purposes. The day-to-day Transactions like Sales, Purchase Orders, etc., are then exported from Tally and integrated by copying the appropriate XML files to FTP, which the Head Offices/ Principal Companies consume. At locations where the volume of transactions is large, the XML File becomes too bulky to upload to FTP. Subsequently, downloading from FTP takes a long time, thereby causing performance issues. Hence, zipping the file before uploading it to FTP makes the task quicker.

Note:
When you are debugging code for performance, always check with test data. Troubleshooting performance with a huge volume of data will consume a lot of time while loading the company and the report. For example, if there are 1 lakh vouchers, first try to improve performance with a company that has only 1,000 vouchers, then with 10,000 vouchers, and finally with 1 lakh vouchers.

When a report takes longer to open, observe the time taken to load the source collections used in the report. You may load the source collection in the calculator panel. The report generation time should be almost equal to the time taken to load the source collections.