# Requirements

Every application design has requirements, and before starting a new project or converting an old one, you'll want to define what those requirements are.

This differs for every organization and every process, so in this walkthrough we'll take a few common examples and show how we might architect those scenarios.

## Common Requirements

### We need an API

Whether it's JSON, plain text, XML, or file content, the built-in .NET hosting with Perigee has you covered. Check out the blueprint here:

{% content-ref url="../blueprints/perigee-with-.net-hosting" %}
[perigee-with-.net-hosting](https://docs.perigee.software/blueprints/perigee-with-.net-hosting)
{% endcontent-ref %}
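As a rough idea of what serving those content types looks like, here is a minimal sketch using plain ASP.NET Core minimal APIs (the blueprint above wires this into Perigee's hosting; the routes and payloads here are purely illustrative):

```csharp
// Minimal ASP.NET Core sketch (plain .NET, not Perigee-specific).
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// JSON response
app.MapGet("/status", () => Results.Json(new { ok = true }));

// Plain-text response
app.MapGet("/ping", () => "pong");

app.Run();
```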

### We have an important, mission-critical daily process that needs to succeed

For this type of process, we have several options.

* The first option is our [Sync Agent](https://docs.perigee.software/core-modules/event-sources/scheduled-logic/sync-agent):
  * This handy agent is highly configurable and supports everything from recurring timers, CRON strings, and blackout periods to behavior trees, inter-agent communication, and more.
  * You, the designer, have complete control over the system and how/when it executes processes.
  * They can be easily turned on or off with the SDK.
  * They can communicate with other defined agents and use them as dependencies to execute.
  * They know how to execute behavior trees to determine run conditions
* The second option is a simple [CRON method](https://docs.perigee.software/core-modules/event-sources/scheduled-logic/cron-thread):
  * CRON methods are very lightweight; their only job is to fire on time.
  * All of the implementation details, retry logic, and execution are up to you.
* The third option is the [Transaction Coordinator](https://docs.perigee.software/core-modules/integration-utilities/transaction-coordinator):
  * The coordinator is highly fault tolerant and is typically used with a queue of requests; however, it can also be used to perform a singular daily task with very high tolerance.
  * The coordinator tracks EVERY step of the process, from initialization data to final completion. It stores a highly compressed file at every step and can resume locally and remotely when the application or server is stopped and restarted.
  * If you need to replay the same transaction later, you can inherit the initial request and run it again.
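Whichever option you choose, the schedule itself is usually expressed as a standard five-field CRON string. A few common daily patterns for reference:

```
0 2 * * *      daily at 02:00
30 6 * * 1-5   weekdays at 06:30
0 */4 * * *    every 4 hours, on the hour
```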

### We use Salesforce and need to trigger a process on new records

If you need direct API access or to use push topics and have data streamed to your application, we have that covered here:

{% content-ref url="../core-modules/event-sources/watchers/salesforce" %}
[salesforce](https://docs.perigee.software/core-modules/event-sources/watchers/salesforce)
{% endcontent-ref %}

### We have CSV files dropped in a folder every day and need to import that data

Use the [DirectoryWatcher](https://docs.perigee.software/core-modules/event-sources/watchers/directory-watch):

* It checks file locks automatically.
* It verifies file size over time to avoid firing callbacks while content is still being written.
* It performs file operations asynchronously.
* It provides options for handling failures.
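To illustrate what the lock check is doing under the hood, here is a rough sketch of the equivalent manual check in plain .NET (illustrative only; the DirectoryWatcher handles this for you):

```csharp
using System.IO;

static class FileReadiness
{
    // Returns true only when no other process holds a write lock on the file.
    public static bool IsFileReady(string path)
    {
        try
        {
            // Requesting exclusive access throws IOException if the file
            // is still open elsewhere (e.g. mid-copy or mid-write).
            using var fs = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None);
            return fs.Length > 0;
        }
        catch (IOException)
        {
            return false; // still locked or actively being written
        }
    }
}
```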

### We have scheduled reports that need to be run

Use the built in [Scheduler](https://docs.perigee.software/core-modules/event-sources/scheduled-logic/scheduler):

* It allows individual rows in a database to control when a function in code is executed.
* It tracks changes to the rows in real time and adjusts scheduled tasks accordingly when the rules change.
* It accepts a CRON string, a timezone, and a runtime type with arguments.
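A scheduling table along these lines is what the bullets describe. The schema below is purely hypothetical, to show the shape of the data; it is not Perigee's actual table:

```sql
-- Hypothetical schema; names and types are illustrative only.
CREATE TABLE ScheduledTask (
    ID          INT PRIMARY KEY,
    CronString  VARCHAR(64)   NOT NULL,  -- e.g. '0 6 * * 1-5'
    TimeZone    VARCHAR(64)   NOT NULL,  -- e.g. 'Eastern Standard Time'
    RunType     VARCHAR(128)  NOT NULL,  -- which registered function to run
    Arguments   NVARCHAR(MAX) NULL       -- serialized arguments for the run
);
```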

### We need to process customer orders

{% hint style="info" %}
Typically, this kind of logic is implemented on the receiving end of a message queue. If you use ActiveMQ, RabbitMQ, Kafka, or anything else, you'll end up with a message that has an initial payload (let's say JSON to make life easy).
{% endhint %}

{% hint style="info" %}
An "order process" typically communicates with a downstream service via REST API calls. In this example, we will do the same thing.
{% endhint %}

This is where the [Transaction Coordinator](https://docs.perigee.software/core-modules/integration-utilities/transaction-coordinator) comes in.

* The coordinator is highly fault tolerant and has a built-in multi-threaded processor to handle a defined number of queued transactions using the [FIFO ruleset](https://en.wikipedia.org/wiki/FIFO_\(computing_and_electronics\)).
* The coordinator tracks EVERY step of the process from the initial JSON payload to every single operation that is performed afterwards.
* Immediately upon queuing the transaction, the system can persist that item to disk and it is tracked and maintained in a highly compressed format that is updated as the transaction progresses.
* The coordinator tracks the responses, requests, timing, and codes as part of this transactional process automatically. No more pawing through logs trying to figure out what happened.
* Retry logic, automatic transaction requeuing and a rich SDK to fine tune the steps are all available to you.
* If the downstream processor loses a day's worth of data to a faulty server, you're able to simply replay an entire transaction process automatically by inheriting the initial JSON payload and walking through the same steps.
* NEVER lose an order again.
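Conceptually, the queuing side of this looks like the sketch below: a generic FIFO queue in plain .NET. This is not the coordinator's actual API; the coordinator layers persistence, retries, and step tracking on top of this idea:

```csharp
using System;
using System.Collections.Concurrent;

// Generic FIFO illustration only.
var orders = new ConcurrentQueue<string>();
orders.Enqueue("{ \"orderId\": 1 }");
orders.Enqueue("{ \"orderId\": 2 }");

// Items come out in the order they went in (first in, first out).
while (orders.TryDequeue(out var payload))
{
    // Each dequeued payload would be processed as a tracked transaction:
    // persist -> call downstream REST API -> record response -> complete.
    Console.WriteLine($"Processing {payload}");
}
```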

### We need to import customer created Excel files

Maybe start with the API template if you want to accept a file through an API call:

{% content-ref url="../blueprints/perigee-with-.net-hosting" %}
[perigee-with-.net-hosting](https://docs.perigee.software/blueprints/perigee-with-.net-hosting)
{% endcontent-ref %}

Then let's use the [ExcelReader](https://docs.perigee.software/core-modules/file-formats/excel) class after we get the file posted to us:

* It's very fault tolerant, especially with customer created data.
* It uses very advanced data detection algorithms to scan for the actual table within a tab, which can bypass headers and notes left by a user.

### We need to synchronize data from a third party with a DeltaKey

{% hint style="info" %}
A **DeltaKey**, sometimes called a **NextToken** or **Offset**, is a third-party API's way of enabling incremental data retrieval: each call returns only the records that are new or modified since the previous call.
{% endhint %}

We want to use three different methods to make this process easy:

1. Let's create a new [Watermark](https://docs.perigee.software/core-modules/integration-utilities/watermarking) that stores our **DeltaKey**.
   * This will automatically sync the delta key to the local disk and restore it upon application restart.
   * It has hooks to initialize the source and provides event handlers when the data is updated.
   * It's very easy to synchronize the watermark to a remote source, like a database or the cloud.
2. Let's register our [RestSharp credentials](https://docs.perigee.software/core-modules/credential-management/restsharp-authenticator) so we don't have to worry about authentication issues:
   1. This takes care of refreshing the token and making sure it's ready to authorize requests when needed.
   2. Anytime we create a new <mark style="color:blue;">**RestClient**</mark>, we supply this to the `Authenticator` of the `RestClientOptions`.
3. Let's use a [CRON scheduled thread](https://docs.perigee.software/core-modules/event-sources/scheduled-logic/cron-thread) to perform this operation on our schedule.
   1. We now have the **DeltaKey** stored, maintained and refreshed on application load and shutdown.
   2. We have the authentication token and parameters taken care of by the RestSharp Authenticators.&#x20;
   3. Run the logic:
      * Simply call the third party with the token and auth ->&#x20;
      * Process any data if it exists ->&#x20;
      * If you processed data, update the watermark ->&#x20;
      * Let the CRON fire again and repeat.
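Put together, one pass of that loop looks roughly like the sketch below. This uses plain `HttpClient`; the delta key would come from the Watermark and the auth from the registered authenticator, and the URL and helper are hypothetical placeholders:

```csharp
using System.Net.Http;
using System.Threading.Tasks;

static class DeltaSync
{
    // One scheduled pass: call the API with the last DeltaKey, process any
    // data, and return the new key so the caller can update the watermark.
    public static async Task<string> SyncOnceAsync(HttpClient client, string deltaKey)
    {
        // 1. Call the third party with the last known DeltaKey.
        //    (URL is a hypothetical example endpoint.)
        var body = await client.GetStringAsync(
            $"https://api.example.com/records?since={deltaKey}");

        // 2. Process any returned data here...

        // 3. Hand back the next token; extraction is third-party specific.
        return ExtractNextToken(body); // hypothetical helper
    }

    private static string ExtractNextToken(string body) => body; // placeholder
}
```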
