Requirements
Every application design has requirements, and before starting a new project or converting an old one, you're going to want to define what those requirements are.
These differ for every organization and every process, so in this walkthrough we'll take a few common examples and show how we might architect those scenarios.
Common Requirements
We need an API
Whether JSON, plaintext, XML, or file content, the built-in .NET hosting with Perigee has you covered. Check out the blueprint here:
Perigee With .NET Hosting
We have an important mission-critical daily process that needs to succeed
For this type of process we have several options.
The first option is our Sync Agent:
This handy agent is highly configurable and can use everything from recurring timers and CRON strings to blackout periods, behavior trees, inter-agent communication, and more.
You, the designer, have complete control over the system and how and when it executes its processes.
They can be easily turned on or off with the SDK.
They can communicate with other defined agents and use them as dependencies to execute.
They know how to execute behavior trees to determine run conditions (see the sketch below).
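To make this concrete, here is a minimal sketch of what wiring up a daily Sync Agent might look like. The entry point and method names (`PerigeeApplication.ApplicationNoInit`, `AddSyncAgent`, and the agent options) are assumptions drawn from the description above rather than the exact Perigee API, so treat it as a shape, not a reference.

```csharp
// Hypothetical sketch: method and property names are assumptions based on
// the feature list above, not the exact Perigee API.
PerigeeApplication.ApplicationNoInit("DailyProcessHost", config =>
{
    config.AddSyncAgent("NightlyImport", agent =>
    {
        agent.CRON = "0 2 * * *";               // recur at 02:00 every day
        agent.Blackout("22:00", "23:30");       // never fire inside this window
        agent.DependsOn("UpstreamExtract");     // wait on another agent first
    },
    (cancelToken, log) =>
    {
        log.LogInformation("Running the nightly import...");
        // ... the mission-critical work goes here ...
    });
});
```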
The second option is a simple CRON method:
CRON methods are very lightweight, and their only job is to fire on time.
All of the implementation details, retry logic, and execution are up to you, as sketched below.
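By contrast, the CRON option is just a scheduled callback. Assuming a registration helper along the lines of `AddCRON` (the name is an assumption; check the Perigee docs for the real signature), it might look like this:

```csharp
// Hypothetical sketch: AddCRON is assumed from the description above.
PerigeeApplication.ApplicationNoInit("CronHost", config =>
{
    // Fire at 06:00 on weekdays. Retries, error handling, and everything
    // else are your responsibility with this option.
    config.AddCRON("MorningJob", "0 6 * * 1-5", (cancelToken, log) =>
    {
        log.LogInformation("06:00 job fired");
        // ... perform the daily work, with your own retry logic ...
    });
});
```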
The third option is the Transaction Coordinator:
The coordinator is highly fault-tolerant and is typically used with a queue of requests; however, it can also be used to perform a singular daily task with very high tolerance.
The coordinator tracks EVERY step of the process from initialization data to final completion. It stores a highly compressed file at every step of the process and can resume locally and remotely when the application or server is stopped and restarted.
If you need to replay the same transaction, you're able to inherit the initial request and run it again later.
We use Salesforce and need to trigger a process on new records
If you need direct API access or to use push topics and have data streamed to your application, we have that covered here:
SalesForce
We have CSV files dropped in a folder every day and need to import that data
Use the DirectoryWatcher:
Checks file locks automatically
Verifies file size over time to prevent callbacks on files that are still being written
Performs file operations asynchronously
Provides options for handling failures (see the sketch after this list)
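A minimal sketch of a CSV import using the DirectoryWatcher. The registration name (`AddDirectoryWatch`) and the callback shape are assumptions based on the bullet points above; the path and pattern are placeholders.

```csharp
// Hypothetical sketch: AddDirectoryWatch and its callback shape are assumptions.
PerigeeApplication.ApplicationNoInit("CsvImporter", config =>
{
    config.AddDirectoryWatch("CsvDrop", @"C:\Drops\Csv", "*.csv",
        async (filePath, cancelToken, log) =>
        {
            // By the time this fires, the watcher has already confirmed the
            // file is unlocked and has stopped growing.
            var lines = await File.ReadAllLinesAsync(filePath, cancelToken);
            log.LogInformation("Imported {Count} rows from {File}", lines.Length, filePath);
        });
});
```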
We have scheduled reports that need to be run
Use the built-in Scheduler, sketched below:
It allows individual rows in a database to control when a function in code is executed.
It tracks changes to the rows in real time and will adjust scheduled tasks accordingly when the rules change.
Accepts a CRON string, a timezone, and a runtime type with arguments.
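Here is a sketch of what a row-driven schedule might look like. The `AddScheduler` registration and the row shape are assumptions based on the description above; in practice each row supplies the CRON string, timezone, and runtime type with arguments.

```csharp
// Hypothetical sketch: AddScheduler and the row shape are assumptions.
PerigeeApplication.ApplicationNoInit("ReportHost", config =>
{
    config.AddScheduler("ReportScheduler",
        connectionString: "Server=.;Database=Ops;Trusted_Connection=True;",
        (row, cancelToken, log) =>
        {
            // Each database row carries a CRON string, a timezone, and a
            // runtime type with its arguments; edits to the row are picked
            // up in real time and the schedule adjusts accordingly.
            log.LogInformation("Running {Type} with {Args}", row.RuntimeType, row.Arguments);
            // ... build and deliver the report here ...
        });
});
```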
We need to process customer orders
Typically, this kind of logic is implemented on the receiving end of a message queue. If you use ActiveMQ, RabbitMQ, Kafka, or anything else, you would end up with a message that has an initial payload (let's say JSON to make life easy).
An "order process" typically communicates with a downstream service via REST API calls. In this example, we will be doing the same thing.
This is where the Transaction Coordinator comes in.
The coordinator is highly fault-tolerant and has a built-in multi-threaded processor to handle a defined number of queued transactions using a FIFO ruleset.
The coordinator tracks EVERY step of the process from the initial JSON payload to every single operation that is performed afterwards.
Immediately upon queuing the transaction, the system can persist that item to disk and it is tracked and maintained in a highly compressed format that is updated as the transaction progresses.
The Coordinator tracks the responses, requests, timing and codes as part of this transactional process automatically. No more pawing through logs trying to figure out what happened.
Retry logic, automatic transaction requeuing and a rich SDK to fine tune the steps are all available to you.
If the downstream processor loses a day's worth of data to a faulty server, you're able to simply replay an entire transaction process automatically by inheriting the initial JSON payload and walking through the same steps.
NEVER lose an order again.
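Here is a rough sketch of how an order pipeline on the coordinator could be laid out. The coordinator-specific names (`AddTransactionCoordinator`, `AddStep`, the options) are assumptions based on the description above, and the HTTP endpoints are placeholders.

```csharp
// Hypothetical sketch: coordinator names are assumptions; endpoints are placeholders.
using System.Net.Http.Json;

PerigeeApplication.ApplicationNoInit("OrderProcessor", config =>
{
    config.AddTransactionCoordinator<OrderPayload>("Orders", coordinator =>
    {
        coordinator.MaxConcurrency = 4;   // multi-threaded, FIFO processing
        coordinator.RetryCount = 3;       // retry before automatic requeue

        // Each step's request, response, timing, and status code is persisted
        // in the compressed transaction file, so a restart resumes mid-flight.
        coordinator.AddStep("ReserveInventory", async (tx, ct) =>
        {
            using var http = new HttpClient();
            var resp = await http.PostAsJsonAsync("https://inventory.example.com/reserve", tx.Payload, ct);
            resp.EnsureSuccessStatusCode();
        });

        coordinator.AddStep("ChargeCustomer", async (tx, ct) =>
        {
            using var http = new HttpClient();
            var resp = await http.PostAsJsonAsync("https://billing.example.com/charge", tx.Payload, ct);
            resp.EnsureSuccessStatusCode();
        });
    });
});

// The queued JSON payload pulled off the message bus.
public record OrderPayload(string OrderId, decimal Total);
```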
We need to import customer-created Excel files
Maybe start with the API template if you want to accept a file through an API call:
Perigee With .NET Hosting
Then let's use the ExcelReader class after we get the file posted to us:
It's very fault-tolerant, especially with customer-created data.
It uses very advanced data-detection algorithms to scan for the actual table within a tab, which lets it bypass headers and notes left by a user.
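Assuming the posted file has already been saved, reading it might look roughly like this. `ExcelReader.ReadTable` and its return type are assumptions based on the description above, not the documented API, and the column names are placeholders.

```csharp
// Hypothetical sketch: ExcelReader.ReadTable and its return type are assumptions.
using var stream = File.OpenRead(@"C:\Uploads\customer-orders.xlsx");

// The reader scans the tab for the actual data table, skipping any headers
// or notes the customer left above it.
var table = ExcelReader.ReadTable(stream, tabName: "Orders");

foreach (System.Data.DataRow row in table.Rows)
{
    Console.WriteLine($"{row["OrderId"]}: {row["Total"]}");
}
```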
We need to synchronize data from a third party with a DeltaKey
A DeltaKey, sometimes called a NextToken or an Offset, is a third-party API's way of supporting incremental retrieval: each call only sends back the records that are new or modified since the last call.
We want to use three different methods to make this process easy:
Let's create a new Watermark that stores our DeltaKey (sketched after this list).
This will automatically sync the DeltaKey to local disk and restore it upon application restart.
It has hooks to initialize the source, and provides event handlers for when the data is updated.
It's very easy to synchronize the watermark to a remote source, like a database or the cloud.
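A sketch of registering that Watermark follows; `AddWatermark` and its hook names are assumptions based on the three points above.

```csharp
// Hypothetical sketch: AddWatermark and its hooks are assumptions.
PerigeeApplication.ApplicationNoInit("DeltaSync", config =>
{
    config.AddWatermark("ThirdPartyDeltaKey",
        initialize: () => "0",                                // value when nothing is stored yet
        onUpdated: (newValue, log) =>
            log.LogInformation("DeltaKey advanced to {Value}", newValue));
    // The value is persisted to local disk and restored on restart; syncing it
    // to a database or the cloud hangs off the same hooks.
});
```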
Let's register our RestSharp credentials so we don't have to worry about authentication issues:
This takes care of refreshing the token and making sure it's ready to authorize when needed.
Anytime we create a new RestClient, we supply this to the .UseAuthenticator() method.
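RestSharp's documented pattern for this is to derive from `AuthenticatorBase` and return the header parameter to attach. The token call below is a placeholder for wherever your registered credentials actually come from; depending on your RestSharp version, the authenticator is attached with the `.UseAuthenticator()` extension or via `RestClientOptions.Authenticator`.

```csharp
using RestSharp;
using RestSharp.Authenticators;

// Acquires or refreshes a bearer token before requests go out.
public class RefreshingAuthenticator : AuthenticatorBase
{
    public RefreshingAuthenticator() : base(string.Empty) { }

    protected override async ValueTask<Parameter> GetAuthenticationParameter(string accessToken)
    {
        // Re-acquire the token whenever the cached one is empty or expired.
        var token = string.IsNullOrEmpty(accessToken) ? await GetTokenAsync() : accessToken;
        return new HeaderParameter(KnownHeaders.Authorization, $"Bearer {token}");
    }

    private static Task<string> GetTokenAsync()
        => Task.FromResult("example-token"); // placeholder for the real credential flow
}
```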
Let's use a CRON-scheduled thread to perform this operation on our schedule.
We now have the DeltaKey stored, maintained and refreshed on application load and shutdown.
We have the authentication token and parameters taken care of by the RestSharp Authenticators.
Run the logic (the full loop is sketched after these steps):
Simply call the third party with the token and auth ->
Process any data if it exists ->
If you processed data, update the watermark ->
Let the CRON fire again and repeat.
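Putting the pieces together, the whole loop might read like this. The Perigee-specific calls (`AddCRON`, `GetWatermark`, `UpdateWatermark`) are assumptions carried over from the earlier sketches, and the endpoint and field names are placeholders; the RestSharp calls themselves are standard.

```csharp
// Hypothetical sketch: Perigee-specific names are assumptions; this goes inside
// the same configuration callback used in the earlier sketches.
config.AddCRON("DeltaSync", "*/15 * * * *", async (cancelToken, log) =>
{
    // 1. Call the third party with the stored DeltaKey and our authenticator.
    var deltaKey = config.GetWatermark("ThirdPartyDeltaKey");
    var client = new RestClient("https://api.example.com")
        .UseAuthenticator(new RefreshingAuthenticator());   // see the authenticator sketch above

    var request = new RestRequest("records").AddQueryParameter("since", deltaKey);
    var response = await client.ExecuteAsync(request, cancelToken);

    // 2. Process any data if it exists.
    if (response.IsSuccessful && !string.IsNullOrEmpty(response.Content))
    {
        // ... import the new/modified records here ...

        // 3. Only advance the watermark once the data is safely processed.
        var nextKey = "next-delta-key-parsed-from-the-response"; // placeholder
        config.UpdateWatermark("ThirdPartyDeltaKey", nextKey);
    }

    // 4. The CRON fires again on the next interval and the cycle repeats.
});
```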