Sunday, September 7, 2014

Macros in Dynamics GP: A Primer for Auditors and Compliance Officers

Definition - Macro:  A set of instructions that tells a computer to complete a specific task or tasks, typically repetitive ones, against a well-defined data set.

Dynamics GP (Great Plains) has a tool that enables users to record and play back macros.  Once a macro has been recorded, a Word “mail merge” can be run against it to map a data set onto the recorded keystrokes and speed repetitive data entry.  When the macro is then “played” back, it enters the merged data into Great Plains as if a user had typed it.
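
For readers who want to see what the data-mapping step looks like in practice, the same merge can also be scripted rather than run through Word. The minimal Python sketch below assumes a recorded macro saved as a template with placeholders standing in for the typed values; the file names, placeholder names and data columns are purely illustrative, and the actual macro statements come from Great Plains' own macro recorder.

    import csv
    from string import Template

    # A minimal sketch of the "mail merge" step, scripted instead of run in Word.
    # template.mac is assumed to be a macro recorded in Great Plains, with the
    # typed values replaced by placeholders such as ${CustomerNumber}; the file
    # names, placeholder names, and customers.csv columns are all hypothetical.
    with open("template.mac", "r") as f:
        template = Template(f.read())

    with open("customers.csv", newline="") as src, open("merged.mac", "w") as out:
        for row in csv.DictReader(src):
            # Substitute one row of data into the recorded keystrokes and append
            # it, producing one long macro to play back in Great Plains.
            out.write(template.substitute(row))
            out.write("\n")

Playing the merged macro back in Great Plains would then enter each row of data exactly as if a user had typed the keystrokes.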

The advantages of this are obvious.  First, a macro can perform tasks much more rapidly than a human.  Second, a macro can perform tasks with greater accuracy than a human.  Finally, some tasks must be run during off-peak hours, and a macro can substantially automate them.

Important concepts

Macros use the same security structure as Great Plains users – a macro cannot run unless Great Plains is open and a user is logged into the system.  If that user does not have access to a particular window in Great Plains, and the macro calls that window, the macro cannot run.

Macros must obey business logic and workflow – a macro cannot override business logic constraints or damage referential integrity in Great Plains (i.e. it cannot do anything a user cannot do).  For instance, if you want to import an invoice for a particular customer, the customer must first exist in Great Plains.  Similarly, if you would like to DELETE a customer who has existing transactions, Great Plains will not allow the customer to be deleted – by a user or by a macro.
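
Because of these constraints, it pays to validate the merge data against the master records before a macro is run. The sketch below, under assumed file and column names, checks that every customer referenced in an invoice import already exists in a customer list exported from Great Plains beforehand (for example via SmartList).

    import csv

    # Pre-flight check (file and column names are assumptions for this sketch):
    # confirm every customer in the invoice data already exists in Great Plains,
    # using a customer list exported from the system beforehand.
    with open("gp_customers.csv", newline="") as f:
        existing = {row["CustomerNumber"].strip() for row in csv.DictReader(f)}

    with open("invoices_to_import.csv", newline="") as f:
        referenced = {row["CustomerNumber"].strip() for row in csv.DictReader(f)}

    missing = sorted(referenced - existing)
    if missing:
        # Create or correct these master records before playing the macro.
        print("Customers not found in Great Plains:", ", ".join(missing))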

Macros are persnickety – if differences in the macro data require different workflows in the system, the macro will periodically fail as the data is processed.  For instance, if a dialogue box appears only when certain conditions are met – such as when there is not enough quantity on hand to cover an inventory transfer from the referenced location – and only some lines in the data meet that condition, the macro will be interrupted every time the dialogue box appears.

Consequently, it is incumbent upon macro designers and testers to uncover these issues and divide a macro into “bite-sized” pieces that can be processed in like data groups.
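
One way to form those like data groups is to split the merge data programmatically before the merge. The sketch below, with hypothetical file and column names, separates inventory transfer lines that are fully covered by quantity on hand from those that are not, so each group can be merged into a macro recorded against that particular workflow (the recording for the short group would include the response to the extra dialogue box).

    import csv

    # Split merge data into "bite-sized" groups that follow the same workflow.
    # File names and the Qty / QtyOnHand columns are hypothetical; quantities on
    # hand would come from a report or query run before the import.
    with open("transfers.csv", newline="") as f:
        reader = csv.DictReader(f)
        fieldnames = reader.fieldnames
        rows = list(reader)

    covered = [r for r in rows if float(r["QtyOnHand"]) >= float(r["Qty"])]
    short = [r for r in rows if float(r["QtyOnHand"]) < float(r["Qty"])]

    for name, group in (("transfers_covered.csv", covered),
                        ("transfers_short.csv", short)):
        with open(name, "w", newline="") as out:
            writer = csv.DictWriter(out, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerows(group)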

Risk and Mitigating Controls
Macros can be roughly grouped into three categories, which carry vastly different levels of risk.  First, macros can be used to import transactions that must then be reviewed, edited and posted by users.  Second, macros can be used to create or edit master records in the system (i.e. customers, vendors, items, etc.).  Third, macros can be used to automate processes in the system.

The first category, macros used to import transactions requiring further processing, carries roughly the same risk as everyday processing in Great Plains by users.  Transactions are entered into the system as if a user had typed them.  Reports are printed and reviewed by the appropriate business owners.  After review, these transactions are edited as necessary, deleted or posted.  Any mistakes made during category-one macro processing would closely mirror mistakes made in normal daily processing in Great Plains.

The second category, macros which create or edit master records in the system, poses greater risk than the first, simply because there is no review step inherent in the process.  The data changes the moment the macro is run; therefore, it is incumbent upon the designers and testers of the macro to ensure a higher level of care and data integrity during the design and testing phase (prior to deployment).

For the third category, automating a task, the risk level depends greatly on the task being automated.  For instance, we currently use macros to substantially automate the tasks associated with importing transactions into Great Plains from other systems.  These activities typically take place in the middle of the night, for a number of practical reasons – system performance being the most important.

In these cases automation poses the least risk, because the process has been designed and tested, and only the imported data changes each night.  The biggest risk to the business is posed by changes that might deleteriously impact this integration (i.e. not all information would be interfaced).  These risks can be substantially mitigated by a reconciliation process between Great Plains and the external system.
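
A reconciliation of that kind can be as simple as comparing record counts and amount totals between the two systems each morning. The sketch below assumes both the external system and Great Plains can produce a CSV export of the night's transactions; the file names and the Amount column are illustrative.

    import csv

    def totals(path):
        # Return (row count, summed amount) for a day's export; the Amount
        # column and the exports themselves are assumptions for this sketch.
        count, amount = 0, 0.0
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                count += 1
                amount += float(row["Amount"])
        return count, round(amount, 2)

    # Compare last night's interface file with what actually landed in
    # Great Plains; any difference is investigated before posting.
    source = totals("external_system_export.csv")
    gp = totals("gp_imported_transactions.csv")
    if source != gp:
        print(f"Reconciliation difference: source {source} vs GP {gp}")
    else:
        print("Counts and totals agree.")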

A word about macros and risk: a macro can enter, with 100% accuracy, thousands of fields of data in several minutes.  Humans cannot.  One study conducted by UPS cites a statistic that, during data entry, a typical user commits a keystroke error every 300 keystrokes.  In my opinion, the rewards of using macros properly far outweigh the risks associated with their use.
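
To put that error rate in perspective, here is a back-of-the-envelope calculation; the field count and keystrokes per field are purely illustrative assumptions, not figures from the study.

    # Illustrative arithmetic only: 5,000 fields averaging 10 keystrokes each,
    # at the quoted rate of one keystroke error per 300 keystrokes.
    fields = 5000
    keystrokes_per_field = 10
    error_rate = 1 / 300

    expected_errors = fields * keystrokes_per_field * error_rate
    print(f"Expected keystroke errors: {expected_errors:.0f}")  # roughly 167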

We recently used macros to enter the variances noted during a physical inventory – I cannot easily estimate the amount of time saved by using macros for the process, but if I had to hazard an educated guess, it would be on the order of days, not hours.  Additionally, my confidence in the accuracy of the data imported into the system is much greater than if it had been hand-typed by teams of people.


Finally, macros are a tool.  Like any tool, it is incumbent upon the user to use them wisely and treat them with the respect they deserve.  Macro use should include a review process with a QA step prior to deployment; this approach substantially reduces the likelihood of macro-related errors.
