Workflow automation – script or code replacement

In most organizations, systems and processes have grown organically. Over time, skilled IT staff have written scripts and “workarounds” to automate archiving tasks that were previously time- and resource-intensive. Such workarounds were welcome in your various environments when they were deployed, but as the business environment changes they become increasingly difficult to track, maintain, and integrate. New systems and infrastructure can mean new scripts; different ways of working, driven by efficiency savings or regulatory compliance, can make the original scripts out of date; and new scheduling standards, turnover of skilled staff, and mergers and acquisitions can expose an organization to high costs well beyond its data systems. Will the new systems, personnel, processes, and programming integrate with the old ones? As experts in managed file transfer solutions and automated workflow engines, we believe it is important not to let the tail wag the dog.

We’ve provided two simple examples that stand on their own to make our case for a coordinated approach to workflow automation for your file transfer needs.

Here is an example of a rudimentary script for setting up an automated file transfer between a host and a client. The script transfers a file from the client to the server and back again, recording each command and its return value to a file.

#!/bin/bash

DATE=`date +%d.%m.%Y-%H.%M`
SRV=sftexa

# scpg3 put: upload the test file to the server, logging the command and its exit code
echo "/opt/xxxxx/bin/scpg3 -B -q testfile.dat $SRV:test" >> scpg3_put_$DATE
/opt/xxxxx/bin/scpg3 -B -q testfile.dat $SRV:test
echo $? >> scpg3_put_$DATE

# scpg3 get: download the file back from the server, logging the command and its exit code
echo "/opt/xxxxx/bin/scpg3 -B -q $SRV:test test" >> scpg3_get_$DATE
/opt/xxxxx/bin/scpg3 -B -q $SRV:test test
echo $? >> scpg3_get_$DATE

This is how an automated MFT workflow solution would handle the same transfer (a rough sketch of an equivalent task definition follows the list):

1. Select the source folder
2. Select the destination folder
3. Select file(s) by name/type/size/date, etc.
4. Schedule the transfer frequency
5. Save the task
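
For comparison, here is a rough sketch of what such a saved task could look like if the workflow engine also exposed a scripting interface. The mft_sdk module, the Workflow class, and the method names are hypothetical placeholders, purely to illustrate how little the task author has to write; a real MFT product would typically offer the same steps through its GUI or its own SDK.

# Hypothetical sketch only: mft_sdk, Workflow and the method names are
# illustrative placeholders, not a real product's API.
from mft_sdk import Workflow

task = Workflow("testfile-roundtrip")
task.source("/data/outbound")           # 1. source folder
task.destination("sftexa:/test")        # 2. destination folder
task.select(name="testfile.dat")        # 3. file selection by name
task.schedule(every="1h")               # 4. transfer frequency
task.save()                             # 5. save the task; logging and retries come from the engine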

Below is an example of a script that retrieves files beginning with “abc” from an FTP server and then exports the file names to a CSV file.

' DTS ActiveX Script task: pull two files from the FTP server.
' The ftp object is assumed to be a COM FTP component created earlier in the task.
Function Main()

    ftp.Hostname = "ftp.test.com"
    ftp.Username = "User"
    ftp.Password = "Approve"
    ftp.Passive = 1
    ftp.AuthTls = 1
    ftp.Ssl = 0

    connectStatus = ftp.Connect()
    If (connectStatus <> 1) Then
        'MsgBox ftp.LastErrorText
        'WScript.Quit
        Main = DTSTaskExecResult_Failure
    Else
        dirStatus = ftp.ChangeRemoteDir("RMed")
        If (dirStatus <> 1) Then
            'MsgBox ftp.LastErrorText
            'WScript.Quit
            Main = DTSTaskExecResult_Failure
        End If

        'MsgBox ftp.GetCurrentRemoteDir()
        localInvoiceFile = "C:\Documents and Settings\My Documents\2dRMed\Invoice_RMed.txt"
        localPatientFile = "C:\Documents and Settings\My Documents\2dRMed\Patient_RMed.txt"
        remoteInvoiceFile = "C:\Clients\Strat\File Uploads\Customer Uploads\RMed\Invoice_RMed.txt"
        remotePatientFile = "C:\Clients\Strat\File Uploads\Customer Uploads\RMed\Patient_RMed.txt"
        'MsgBox remoteInvoiceFile
        'MsgBox remotePatientFile

        ' Transfer the invoice file and record success or failure.
        transferStatus = ftp.GetFile(remoteInvoiceFile, localInvoiceFile)
        If (transferStatus <> 1) Then
            'MsgBox ftp.LastErrorText
            Main = DTSTaskExecResult_Failure
        Else
            Main = DTSTaskExecResult_Success
        End If

        ' Transfer the patient file and record success or failure.
        transferStatus = ftp.GetFile(remotePatientFile, localPatientFile)
        If (transferStatus <> 1) Then
            'MsgBox ftp.LastErrorText
            Main = DTSTaskExecResult_Failure
        Else
            Main = DTSTaskExecResult_Success
        End If

    End If

    ftp.Disconnect

End Function

Here’s how an automated MFT workflow solution would do it (again, a rough sketch of an equivalent task definition follows the list):

1. Select the source folder
2. Select files by name: abc*.*
3. Select the destination folder
4. Export the file names to .csv
5. Schedule the task frequency
6. Save the task
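
Sketched against the same hypothetical mft_sdk interface as above, the second task reduces to a filename pattern and an export step (every module, method, path, and schedule here is again an illustrative placeholder):

# Hypothetical sketch only: module, class and method names are placeholders.
from mft_sdk import Workflow

task = Workflow("rmed-abc-pull")
task.source("ftp://ftp.test.com/RMed")      # 1. source folder
task.select(name="abc*.*")                  # 2. file selection by pattern
task.destination(r"C:\incoming\RMed")       # 3. destination folder
task.export_filenames("transferred.csv")    # 4. export the transferred file names to CSV
task.schedule(cron="0 2 * * *")             # 5. task frequency
task.save()                                 # 6. save the task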

These two examples cover simple tasks, and neither script includes any security or encryption, which would come as standard in an automated MFT workflow solution. Even a brief look at the two examples shows where your business could make savings and improvements. You may have reservations because of the complexity of the scripts you currently rely on, but rest assured that mature automated workflow solutions do far more than simple tasks. They typically ship with a rich, configurable set of pre-built workflows covering most file transfer needs, and these can be customized with minimal training and no programming experience. Even for those more complicated processes that need further refinement, a good workflow engine will offer modules and APIs that open it up to programmatic control.
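
To make the point about modules and APIs concrete, here is a minimal sketch of triggering a pre-built workflow over a REST API from Python. The endpoint, payload fields, and token are invented for illustration; every MFT product exposes its own interface, so treat this only as the shape such an integration might take.

# Minimal sketch, assuming a hypothetical MFT server with a REST API.
# The URL, payload fields and token are placeholders, not a real product's API.
import requests

response = requests.post(
    "https://mft.example.com/api/v1/workflows/rmed-abc-pull/run",
    headers={"Authorization": "Bearer <api-token>"},
    json={"parameters": {"pattern": "abc*.*"}},
    timeout=30,
)
response.raise_for_status()
print("Workflow started, run id:", response.json().get("runId"))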

There is an irony in manually scripting and scheduling your “automated” processes; with an automated workflow engine, that irony simply disappears.
