Tidal has a great interface that is user-friendly and easy to navigate. It automates loads of tasks and complex workflows. The solution has increased efficiency and decreased manual intervention, which improves accuracy and productivity. One of its outstanding features is flexibility: we can automate tasks of any size and complexity. Tidal integrates with third-party systems, which makes it easy to connect and exchange data. Alert and notification features keep the user informed of task status.
I am part of the sales operations team, which handles product pricing: getting the pricing set up in the system, collecting sales data from sales teams across various geographies, converting the files into a consumable Excel format, uploading the data to the database, and connecting the database to reporting and data visualization tools like Power BI, which help us build reports and dashboards and analyze business trends. All of these tasks involve a lot of manual processes that take considerable time and effort.
Professional system administrator at DXC Technology
Real User
Top 10
Apr 26, 2023
Tidal Automation was widely used for alerts, notifications, and analysis. As we handled the servers on which applications run, we would get incident alerts if there was any problem with a running application or an OS issue. Everything was addressed and worked on in a timely manner with the help of Tidal. We also had to analyze server performance repeatedly to improve stability; for that, Tidal Automation was very useful, and it reduced manual intervention in long, repetitive tasks.
Automating and managing intricate, crucial business workflows across numerous systems and applications is the main use case for the Tidal workload automation solution. It is especially helpful for sectors like banking, manufacturing, transportation, and healthcare that have high volumes of time-sensitive processes. In these sectors, any delay or mistake in finishing tasks can have serious repercussions and affect the bottom line. The solution can also assist organizations in achieving regulatory compliance and supports compliance requirements.
In our organization, we run scheduled and self-triggered jobs in batches, iteratively. We use Tidal for all three primary purposes: development, testing, and production. Tidal has provided the flexibility to run jobs in all these environments. In the production environment, we perform systems administration and development of the job scheduling environment. I configure Tidal, maintain it, deploy it, apply hotfixes, and perform any other system admin function. In terms of deployment, we're on-premises.
The primary use case for Tidal Automation software will depend on the specific needs and goals of each organization. It includes tasks such as job scheduling, workload automation, and event-driven automation. It helps in batch processing, data transfers, and job scheduling. The software can also be used to integrate with a wide range of enterprise systems and applications, including ERP systems, databases, and messaging systems. Tidal Automation software improves the efficiency and accuracy of business processes, reduces errors and delays, and optimizes resource utilization.
Tidal Workload Automation Software is primarily used for scheduling, monitoring, and managing critical business and IT workflows across an organization's IT infrastructure. This software automates the execution of various workflows, including batch jobs, data transfers, file processing, and application integration, among others. The software provides a centralized platform for managing and automating crucial business processes such as report generation and customer service operations. The software can be used in a variety of industries, including finance, health care, manufacturing, etc.
Tidal Automation can handle batch processing task scheduling and implementation, decreasing errors and increasing productivity. Data handling and administration duties such as data merging, migration, and transformation can be automated using the program. It can also automate duties related to IT operations, such as software deployment, server upkeep, and backup and recovery. By automating procedures for data retention, audit records, and security controls, the software can help guarantee legal conformance. It can be used in a variety of sectors to help companies automate processes, increase efficiency, and reduce errors.
My primary use case for Tidal Workload Automation Software is its ability to handle job scheduling and automation. The software allows organizations to schedule and automate tasks across multiple platforms, applications, and systems, which helps reduce manual intervention and improves efficiency. It can automate millions of jobs with ease and without downtime. The software manages the workload very effectively. It provides real-time visibility into workload status and ensures that resources are used for high-priority tasks.
Tidal Automation uses advanced algorithms and machine learning techniques to analyze real-time data from tidal turbines and adjust their settings to maximize energy output while minimizing maintenance requirements. This can help increase the efficiency and reliability of tidal energy systems, leading to cost savings and improved environmental sustainability. Overall, the primary use case for Tidal Automation is to help manage and optimize tidal energy production in a variety of settings, ultimately contributing to the growth and success of the renewable energy industry.
The primary use case of Tidal Automation is to automate and manage complex, time-consuming scheduling tasks and reduce manual effort. Tidal Automation can streamline these tasks by automating data collection and analysis, scheduling maintenance tasks, and monitoring the performance of environments and the associated systems. By automating repetitive and time-consuming tasks, Tidal Automation has helped us save time and resources, reduce errors, and improve operational efficiency. It was deployed on-premises.
Scheduling Operations Engineer at a financial services firm with 5,001-10,000 employees
Real User
Top 20
Dec 1, 2022
We use it in our production environment. We use it to schedule and execute many jobs. It is used by multiple application teams within our organization, such as SQL, Unix, ETL platform, MFT, and our AWS team. Other application teams include front office, back office, and accounting. They all use the Tidal environment.
We use Tidal to automate all the jobs within our IT applications, especially for our ERP, which is JD Edwards, as well as Oracle, and Microsoft. Currently, we execute 12,000 jobs per day through the platform.
Batch Production Manager at a consultancy with 201-500 employees
Real User
Oct 7, 2022
In our organization, we run scheduled jobs in batches. There are three different uses for Tidal: development, testing, and production. It's the environment that allows you to run batch jobs in those three environments. I handle production work, such as systems administration and development of the batch environment. So, I install Tidal, maintain it, upgrade it, apply hotfixes, and perform any other system admin function. I also set up the batch environments for development, QA, and production that run in Tidal, and I train people on how to use it. In terms of deployment, we're on-prem. We have adapters and virtual machines, but everything is on-prem.
Tidal is used for workload automation and batch jobs. It lets us run financial jobs, warehouse replenishment jobs, and reporting jobs across multiple applications, such as SAP and warehouse management systems, as well as our auditing system. We run something like 11,000 jobs every day across our enterprise. The application middleware that we use it to automate includes Azure Data Factory, data warehouse jobs, Azure Analysis Services jobs, our PDM database system, and our warehouse system that handles product reordering, picking and packing, and e-com orders.
We have multiple ERP systems, and we use it to schedule all of our jobs in the ERP systems, specifically JD Edwards. We also have a lot of integrations using FTP file events to move files around, and we are also using the software to automate all of our manual stopping and starting of services and patching of systems.
Head of Global Middleware Platforms at a pharma/biotech company with 10,001+ employees
Real User
Jun 6, 2022
Our largest use cases are the execution of SAP and JD Edwards jobs. There are a lot of other technologies, but in terms of the Pareto principle, that's the bulk of our processing. SAP is what we use for manufacturing and operational work with the actual products; JD Edwards is more for financial reporting, projections, and things like that. Windows and Unix hosts are probably the second most common use case, as well as web services. Any REST API would apply, and we use a lot of REST API technology.
Automation Manager at a financial services firm with 1,001-5,000 employees
Real User
Apr 5, 2020
We use it for a host of standard, general tasks, like batch workflow automation, in the front and back offices. We have also centralized all of our SQL Server maintenance on it. Instead of having SQL Server maintenance plans or jobs running on 300 or 400 disparate servers, we run them through Tidal, so we have consolidated administration and reporting that feeds straight into ServiceNow.

Last year, we made a step change with our DR process. We had a bunch of people running manual scripts and different things across teams: networks, Wintel, DBAs, and application support. They were running their own separate scripts to do application failover, and this differs depending on whether it's active-active or active-passive replication. What we did was integrate Tidal with different command-line-driven jobs, like PowerShell commands, to fail over applications and infrastructure as a sequenced set of dependent jobs. Therefore, if we need DR, we are not relying on a mix of SMEs asking, "Where was that script?" or "How do we fail this over?" Instead, we can just push a button and the thing fails over, which is beautiful.

Additionally, we do compliance reporting from within Tidal, and like many companies, we are audited by PwC. Everyone has technology control frameworks that they have to evidence. Instead of people taking screenshots, we find out what information PwC needs and build a job using the CLI that runs at either month end or quarter end. The job goes off, collects that evidence, comes back, and gets formatted. Then, we just drop it in SharePoint or use Tidal to save it to a file share, sending an email to say, "Your evidence is collected. You need to review it, then send it on to audit."

We also use it for a vast array of housekeeping jobs. Tidal is not a monitoring tool, but automation is basically as far as your imagination can take you with anything that runs from a command line, which is virtually anything.
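The "sequenced set of dependent jobs" idea above can be sketched in miniature. This is a hypothetical Python illustration, not the reviewer's actual implementation: each step runs only if the previous one succeeded, which is the behavior the Tidal dependency chain provides, and in practice the commands would be the teams' PowerShell failover scripts.

```python
# Hypothetical sketch: run failover steps strictly in order, stopping at
# the first failure, the way a chain of dependent scheduler jobs behaves.
# The commands passed in are invented placeholders, not real failover scripts.
import subprocess

def run_sequence(steps: list[list[str]]) -> bool:
    """Run commands in order; each runs only if the previous succeeded."""
    for cmd in steps:
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print(f"FAILED at step: {' '.join(cmd)}")
            return False  # downstream dependent steps never run
    return True
```

A scheduler models the same ordering declaratively, with each job's success as the dependency condition for the next.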
We previously had a use case where it gave us a quick alert when some of our infrastructure became unavailable; we just had it running every minute. Typically, it's not an enterprise monitoring tool, but if you have some deficiencies, or things that you need to enhance or give a different dimension to, we've used it for that in the past. We also run it against our infrastructure using PowerShell to pull a whole host of reporting daily, which is useful. We use Tidal to run SQL Server and Windows; there is not really any Unix. Since we started using it, they have done more in AWS, and they now have a whole bunch of different cloud capabilities. We are moving towards private cloud; we're in the sandbox at the moment.
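As a rough sketch of that "quick alert" job run every minute: the example below is invented for illustration (the reviewer's jobs use PowerShell), and the host names and ports are placeholders. The point is that a non-zero exit code is what turns a scheduled job into an alert.

```python
# Hypothetical sketch of a per-minute availability check that a scheduler
# such as Tidal could run as a job. A non-zero return marks the job failed,
# which the scheduler's alerting then picks up. Hosts are invented examples.
import socket

HOSTS = [("db01.example.internal", 1433), ("app01.example.internal", 443)]

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def check_hosts(hosts) -> int:
    """Print any unreachable hosts; return a scheduler-friendly exit code."""
    down = [(h, p) for h, p in hosts if not is_reachable(h, p)]
    for h, p in down:
        print(f"UNREACHABLE: {h}:{p}")
    return 1 if down else 0  # non-zero => the scheduled job fails and alerts
```

A Tidal job would simply run this script on a one-minute schedule and notify on failure.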
Production Control Engineer at a healthcare company with 201-500 employees
Real User
Mar 3, 2020
It's a company-wide batch scheduler. It runs tons for us. It runs Windows, Unix/Linux. We connect with a lot of databases: Oracle, SQL, Sybase. We have BusinessObjects BI adapters, we scan emails, and we incorporate it with TriZetto Facets healthcare solutions. There's so much. It's our core enterprise scheduler.
We use Tidal to run jobs across multiple application platforms, such as SAP, ECC, PDN, and Informatica, as well as jobs that run in the Azure cloud. We also use it for several warehouse management jobs with OS/400 and AS/400 connectors. We have a lot of different types of connectors, and we are bringing all these jobs into Tidal so we can set up dependencies between jobs, e.g., an SAP job and an OS/400 job may be dependent on each other in some way, allowing a cross-platform job flow. We are currently on the most recent version.
Production Control Analyst at a healthcare company with 1,001-5,000 employees
Real User
Feb 9, 2020
I have three installs of Tidal: production, qual and dev. I have a portfolio of 12,000 unique job definitions in production, 13,500 definitions in qual, and about 8,000 in dev. The Tidal adapters I use are for Windows and Linux agents, as well as Informatica, Cognos, and mSQL.
We have a product called J.D. Edwards which is our ERP system. Our biggest use case for Tidal is to automate jobs that we submit through J.D. Edwards. Our second use case would be automating maintenance — stopping services, deleting logs — your "keeping the lights on" type of stuff. And our third use case is using it for any automation tasks that we come across. Tidal is our product of choice at the moment. If we're going to automate something, we're going to use Tidal to automate it. We integrate Tidal with Linux, Windows, iSeries, SQL Server, and Oracle, in addition to J.D. Edwards.
Lead Control Analyst at CENTRAL STATES SOUTHEAST & SOUTHWEST AREAS HEALTH & WELFARE F
Real User
Feb 5, 2020
We use Tidal extensively to run our health and welfare claims processing throughout the day. That's the reason we got Tidal back in 2011. We receive 15,000 to 20,000 claims a day, and we use Tidal to process the whole thing, all the way through to creating checks at the end of the day. Since 2011, we've expanded it to other applications and other processes: mostly reports, and files that come in electronically from other companies and feed other applications. In a roundabout way, we use Tidal to execute the applications and load whatever needs to be done in them.

The transfer function we used to do with Tidal has been switched over to another software product called Cleo, which is run by our network team. That way, they can control all the information that comes in and out of our building: they can put secure FTP on it, encrypt and decrypt the information, and set password protections. Cleo has its own scheduler, like Tidal, but they don't use it. They let Tidal execute the Cleo commands to bring the data in, and Tidal will execute any application programs after that.

Overall, we run 1,100 to 1,200 steps every day, depending on the day of the week. I call them "steps," but they're actually multiple steps. Before you get to the actual processing of a program, there might be a move, a copy, or a delete when we're clearing out folders, using DOS commands. We then move data around to certain directories so that the TriZetto software we use, or any internal programs we have in VBS, .NET, Oracle, or MS SQL stored procedures, can find that data.

We're also starting to use a new MDM application which captures addresses from various databases, verifies they are correct, and pulls them together into one database. After all of our nightly processing, we have Tidal kick off the main MDM master so that all those addresses are in sync.
Tidal sits on its own database and then it talks, through agents, to the other applications.
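The move/copy/delete "steps" that precede actual processing can be sketched as small scripted operations. This is an illustrative Python sketch with an invented directory layout; the reviewer's jobs use DOS commands rather than Python.

```python
# Hypothetical sketch of pre-processing housekeeping: clear a working
# folder, then move incoming files to where downstream programs expect
# them. Directory names are illustrative assumptions.
from pathlib import Path
import shutil

def clear_dir(path: Path) -> int:
    """Delete every file directly inside `path`; return how many were removed."""
    removed = 0
    for item in path.iterdir():
        if item.is_file():
            item.unlink()
            removed += 1
    return removed

def move_incoming(src: Path, dest: Path) -> list[str]:
    """Move all files from src to dest; return the moved file names in order."""
    dest.mkdir(parents=True, exist_ok=True)
    moved = []
    for item in sorted(src.iterdir()):
        if item.is_file():
            shutil.move(str(item), str(dest / item.name))
            moved.append(item.name)
    return moved
```

A scheduler would run steps like these as separate jobs ahead of the application program, with each one a dependency of the next.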
We use it to call multiple source systems, such as Informatica workflows, Unix scripts, Windows scripts, PowerShell, batch files, and a few SAP web programs. We use it for certain file events and monitors. Tidal, by itself, can't monitor, but we create a script and a job for that, then schedule it in Tidal. We use Tidal for multiple purposes. We use it with our SQL Server, where we call any procedures, statements, SQLs, or jobs from Tidal. We also call a few HANA stored procedures from it; as of today, Tidal doesn't have an adapter, but we have some internal scripts that call stored procedures from Tidal. We run around 2,000 to 3,000 jobs per day. The infrastructure is in Azure.
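A file monitor of the kind described, where "we create a script and a job for that, then schedule it in Tidal", might look like the following hypothetical sketch; the file path and the non-zero-exit convention are assumptions for illustration.

```python
# Hypothetical sketch of a scheduled file-monitor job: succeed if the
# expected file has arrived and is non-empty, fail otherwise so the
# scheduled job goes red. The path is an invented placeholder.
from pathlib import Path

def file_arrived(path: Path, min_bytes: int = 1) -> bool:
    """True once the expected file exists and holds at least min_bytes."""
    return path.is_file() and path.stat().st_size >= min_bytes

def monitor(path: Path) -> int:
    """Exit-code style result: 0 if the file has arrived, 1 otherwise."""
    if file_arrived(path):
        print(f"OK: {path} present")
        return 0
    print(f"MISSING: {path}")
    return 1
```

Scheduling this every few minutes gives a simple file-event trigger on systems without a native adapter.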
Data Platforms Operations Lead Managed Hosting at a marketing services firm with 1,001-5,000 employees
Real User
Jan 29, 2020
Our use of Tidal is mostly file-event driven. We use it to manage our ingestion, processing, and loading of data. Tidal has a hook and it runs ETL for us. It runs jobs and SQL and some of our database appliances like IIAS, the new version of Netezza Teradata. We have a file gateway that receives a file and drops it in a location. That file event picks it up and drops it over to the ETL tool. The ETL tool will run and aggregate a number of source files and turn it into a properly formatted input file. That file then goes through data hygiene and data analysis. Then it goes through a matching process. It is then put back out and runs an ETL process to stick it into a SQL database. And then there are a number of jobs that are run in the SQL database to manipulate that file. We don't have a lot of calendared events or scheduled windows. We have a central location for Tidal in our data center, and then we have client-hosted solutions where we run smaller instances of Tidal, and those are in the cloud. We use AWS, Azure, and GCP.
Sr System Engineer at a financial services firm with 5,001-10,000 employees
Real User
Jan 29, 2020
We use it to manage our batch processing. For us, it came in as a replacement for a lot of different systems running crontab. In our case it's primarily for Unix/Linux systems that don't have their own mechanism for kicking off all these batch processes. It's the coordinator of all of our background processes and batch jobs that are running overnight and during the day. We use it to kick off custom Unix/Linux scripts that will launch our application processes. It's almost entirely Windows and Linux shell scripts that it's kicking off.
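A minimal sketch of the kind of wrapper a scheduler job uses to kick off such a script, assuming the scheduler marks success or failure by exit code: it launches the script, echoes its output into the job log, and propagates the return code. The details are illustrative, not the reviewer's actual scripts.

```python
# Hypothetical sketch of a scheduler job wrapper for a Unix/Linux script:
# run it, surface its output in the job log, and propagate its exit code
# so the scheduler can mark the job succeeded or failed.
import subprocess

def run_script(cmd: list[str]) -> int:
    """Run a script, echo its output to the job log, return its exit code."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.stdout:
        print(result.stdout, end="")
    if result.stderr:
        print(result.stderr, end="")
    return result.returncode
```

This mirrors what replacing scattered crontab entries with a central scheduler buys: one place where every script's exit status and output are recorded.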
Team Lead at a manufacturing company with 10,001+ employees
Real User
Jan 29, 2020
We use it primarily to run SAP jobs. While it runs other minor stuff, 98 percent is SAP. We have a number of different types of SAP systems, and different teams are responsible for configuring, managing, and setting up jobs. They are the ones who define the jobs and schedule them. There is an administrative team responsible for maintaining the system landscape and providing training for Tidal; they also provide standards, guidelines, and jobs. We use the solution for cross-platform, cross-application workloads within SAP. Within SAP, we might run a job on one system but wait for jobs on other systems to finish first; that is our interdependency between SAP systems. However, we don't do things like run something on SAP and then go do something on a non-SAP system. We may have a bit of that, but it's not a big part of what we do. It's mostly within or between SAP systems.
For most of the companies where I have put Tidal in, it runs everything. It does back office work, handles trading, time reporting, a lot of file transfers between vendors and regulatory bodies, etc. We use it to do a whole variety of things. File transfers are our most valuable use cases because those are the ones where we tend to have service level agreements and potential fines. Right now, we are in a traditional installation with local servers. We use the solution with Hadoop and Workday, but we are not using their adapters.
IT Vendor Manager at a manufacturing company with 5,001-10,000 employees
Real User
Jan 27, 2020
We primarily use it for scheduling our JD Edwards ERP software batch jobs. The solution runs on Windows. It also integrates with our Unix & AIX systems. We use it for automating EDI transactions, so it reaches out to FTP sites as well.
Tidal Administrator at a retailer with 5,001-10,000 employees
Real User
Jan 27, 2020
We're running jobs on a global scale. Being a global company, we're running scheduled jobs and ad hoc jobs across different regions. Jobs cover backend processing, financials, and the like. We're running on an SAP ERP system and we're also running Informatica for data warehouse. We're running BusinessObjects web reports as well as a lot of straight Windows and Unix command-line things. We run FTP processing, PGP encryption processing, and data services jobs. We're running about seven or eight of the different adapter types that Tidal has available. We have it on-prem. Both our test and production environments are on fault-tolerant setups.
Sr. Platform Engineer at Adobe Systems Incorporated
Real User
Jan 23, 2020
We are mainly using it for triggering data jobs. It does a lot of ITIL stuff and data movement from systems into Hadoop. We use it because it has the capability of dependency triggering or dependency running; that's the main idea behind it. It also helps us centralize and organize jobs across the organization. We use Tidal to run our Hadoop backup system, SAP HANA, and SAP BusinessObjects. We also trigger a lot of jobs in SnapLogic, Salesforce, ServiceNow, Workday, and Tableau, along with a couple of dashboards. We run a couple of batches from our Unix and Windows machines: the stuff that the developers are working on and want to run in ITIL. But SAP is the main thing. The main goal is to use Tidal for managing and monitoring cross-platform, cross-application workloads; the ability to manage those loads is what it does well. I can put a job to run in SAP and, once that job ends successfully, run a job in Hadoop or in Salesforce.
We primarily use Tidal Automation to schedule batch jobs. We are a solution provider and the automation that we implement is for our clients.
We use it for a host of standard/general stuff, like batch workflow automation, in the front and back offices. We have also centralized all of our SQL Server maintenance on it. Instead of having SQL Server maintenance plans or jobs running on 300 or 400 disparate servers, we run them through Tidal, so we have consolidated administration and reporting that feeds straight into ServiceNow.

Last year, we made a step change with our DR recovery process. We had a bunch of people across different teams, networks, Wintel, DBAs, and application support, running their own separate manual scripts to do application failover, which differs depending on whether it's active-active or active-passive replication. What we did was integrate the different command-line-driven jobs, like PowerShell commands, into a sequenced set of dependent jobs that effectively fail over applications and infrastructure. Therefore, if we need DR, we are not relying on a mix of SMEs saying, "Where was that script?" or "How do we fail this over?" Instead, we can just push a button and the thing fails over, which is beautiful.

Additionally, we do compliance reporting from within Tidal. Like many people, we are audited by PwC, and everyone has technology control frameworks that they have to evidence. Instead of people taking screenshots, we find out what information PwC needs and build a job using the CLI, which runs at either month end or quarter end. The job goes off, collects that evidence, comes back, and formats it. Then we drop it in SharePoint, or use Tidal to save it to a file share and send an email off to say, "Your evidence is collected. You need to review it, then send it on to audit."

We also use it for a vast array of housekeeping jobs. It is not that Tidal is a monitoring tool, but with automation you can go as far as your imagination can take you: anything that runs from a command line, which is virtually anything, you can do.
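A sequenced, push-button failover like the one described above ultimately reduces to running command-line steps in order and stopping the chain the moment one fails, which is exactly what dependent jobs in a scheduler do. A minimal sketch of that pattern; the step commands here are placeholders, not the reviewer's actual failover scripts:

```python
import subprocess
import sys

def run_sequence(steps):
    """Run command-line steps in order; stop at the first failure
    and return its exit code, mirroring how a scheduler holds
    dependent jobs when an upstream job fails. Returns 0 if all
    steps succeed."""
    for step in steps:
        result = subprocess.run(step, capture_output=True, text=True)
        if result.returncode != 0:
            print(f"step failed: {step}", file=sys.stderr)
            return result.returncode
    return 0

if __name__ == "__main__":
    # Placeholder commands; a real failover would invoke PowerShell
    # scripts or vendor CLIs, as the review describes.
    code = run_sequence([
        [sys.executable, "-c", "print('failing over database')"],
        [sys.executable, "-c", "print('repointing application')"],
    ])
    print("sequence exit code:", code)
```

In Tidal itself, each list entry would instead be a separate job with a dependency on the previous one, so operators get per-step status and rerun capability.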
We previously had a use case for it to give us a quick alert when some of our infrastructure became unavailable; we just had it running every minute. Typically, it's not an enterprise monitoring tool, but if you have some deficiencies, or things that you need to enhance or give a different dimension to, we've used it for that in the past. We also run it against our infrastructure using PowerShell to pull a whole host of reporting daily, which is useful. We use Tidal to run SQL Server and Windows; there is not really any Unix. Since we started using it, they have added more support for AWS and now have a whole bunch of different cloud capabilities. We are moving towards private cloud and are in the sandbox at the moment.
It's a company-wide batch scheduler. It runs tons for us. It runs Windows, Unix/Linux. We connect with a lot of databases: Oracle, SQL, Sybase. We have BusinessObjects BI adapters, we scan emails, and we incorporate it with TriZetto Facets healthcare solutions. There's so much. It's our core enterprise scheduler.
We use Tidal to run jobs across multiple application platforms, such as SAP ECC, PDN, and Informatica, as well as jobs that run in the Azure cloud. We also use it for several warehouse management jobs with OS/400 and AS/400 connectors. We have a lot of different types of connectors, and we bring all these jobs into Tidal so we can set up dependencies between them; e.g., an SAP job and an OS/400 job may be dependent on each other in some way, allowing a cross-platform job flow. We are currently on the most recent version.
I have three installs of Tidal: production, qual, and dev. I have a portfolio of 12,000 unique job definitions in production, 13,500 definitions in qual, and about 8,000 in dev. The Tidal adapters I use are for Windows and Linux agents, as well as Informatica, Cognos, and MS SQL.
We have a product called J.D. Edwards which is our ERP system. Our biggest use case for Tidal is to automate jobs that we submit through J.D. Edwards. Our second use case would be automating maintenance — stopping services, deleting logs — your "keeping the lights on" type of stuff. And our third use case is using it for any automation tasks that we come across. Tidal is our product of choice at the moment. If we're going to automate something, we're going to use Tidal to automate it. We integrate Tidal with Linux, Windows, iSeries, SQL Server, and Oracle, in addition to J.D. Edwards.
We use Tidal extensively to run our health and welfare claims processing throughout the day. That's the reason we got Tidal back in 2011. We receive 15,000 to 20,000 claims a day, and we use Tidal to process the whole thing, all the way through to creating checks at the end of the day. Since 2011, we've expanded it to other applications and other processes: mostly reports, and files that come in electronically from other companies and feed other applications. In a roundabout way, what we use Tidal for is to execute the applications that load whatever needs to be done on those applications.

The transfer function we used to do with Tidal has been switched over to another software product called Cleo, which is run by our network team. That way, they can control all the information that comes in and out of our building: they can put secure FTP on it, encrypt and decrypt the information, and set password protections. Cleo has its own scheduler, like Tidal, but they don't use it. They let Tidal execute the Cleo commands to bring the data in, and Tidal will execute any application programs after that.

Overall, we run 1,100 to 1,200 steps every day, depending on the day of the week. I call them "steps," but they're actually multiple steps. Before you get to the actual processing of a program, there might be a move, a copy, or a delete when we're clearing out folders, using DOS commands. We then move data around to certain directories so that either the TriZetto software that we use or any internal programs that we run in VBS, .NET, Oracle, or MS SQL stored procedures can find that data.

We're also starting to use a new MDM application which captures addresses from various databases, verifies they are correct, and pulls them together into one database. After all of our nightly processing, we have Tidal kick off the main MDM master so that all those addresses are in sync.
Tidal sits on its own database and then it talks, through agents, to the other applications.
We use it to call multiple source systems, such as Informatica workflows, Unix scripts, Windows scripts, PowerShell, batch files, and a few SAP web programs. We use it for certain file events and monitors. Tidal, by itself, can't monitor, but we create a script and job for that, then schedule it in Tidal. We use Tidal for multiple purposes. We use Tidal for our SQL Server, where we call any procedures, statements, SQL, or jobs from Tidal. We also call a few HANA stored procedures from it. As of today, Tidal doesn't have a HANA adapter, but we have some internal scripts which call the stored procedures from Tidal. We run around 2,000 to 3,000 jobs per day. The infrastructure is in Azure.
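Because there is no HANA adapter here, the reviewer wraps stored-procedure calls in internal scripts that Tidal schedules like any other command. A minimal sketch of such a wrapper; the schema and procedure names are hypothetical, the driver is assumed to be a DB-API-compliant client such as SAP's `hdbcli`, and only the statement-building part is exercised below:

```python
def build_call(schema: str, procedure: str, num_args: int) -> str:
    """Build a parameterized CALL statement for a stored procedure,
    e.g. build_call('FIN', 'CLOSE_PERIOD', 2) ->
    'CALL FIN.CLOSE_PERIOD(?, ?)'."""
    placeholders = ", ".join(["?"] * num_args)
    return f"CALL {schema}.{procedure}({placeholders})"

def run_procedure(conn, schema: str, procedure: str, args=()):
    """Execute the procedure over an existing DB-API connection
    (e.g. one opened with hdbcli) and return any result rows, so
    an exception propagates as a non-zero exit for the scheduler."""
    cursor = conn.cursor()
    try:
        cursor.execute(build_call(schema, procedure, len(args)), args)
        return cursor.fetchall() if cursor.description else []
    finally:
        cursor.close()
```

Letting exceptions escape (rather than swallowing them) is the important design choice: the wrapping script exits non-zero and Tidal marks the job failed.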
Our use of Tidal is mostly file-event driven. We use it to manage our ingestion, processing, and loading of data. Tidal has a hook, and it runs ETL for us. It runs jobs and SQL on some of our database appliances, like IIAS (the new version of Netezza) and Teradata. We have a file gateway that receives a file and drops it in a location. A file event picks it up and hands it over to the ETL tool. The ETL tool runs, aggregates a number of source files, and turns them into a properly formatted input file. That file then goes through data hygiene and data analysis, then through a matching process. It is then put back out, and an ETL process runs to load it into a SQL database. Finally, a number of jobs run in the SQL database to manipulate that data. We don't have a lot of calendared events or scheduled windows. We have a central location for Tidal in our data center, and then we have client-hosted solutions where we run smaller instances of Tidal; those are in the cloud. We use AWS, Azure, and GCP.
We use it to manage our batch processing. For us, it came in as a replacement for a lot of different systems running crontab. In our case it's primarily for Unix/Linux systems that don't have their own mechanism for kicking off all these batch processes. It's the coordinator of all of our background processes and batch jobs that are running overnight and during the day. We use it to kick off custom Unix/Linux scripts that will launch our application processes. It's almost entirely Windows and Linux shell scripts that it's kicking off.
We use it primarily to run SAP jobs. While it runs other minor stuff, 98 percent is SAP. We have a number of different types of SAP systems, and there are different teams who are responsible for configuring, managing, and setting up jobs. They are the ones who define the jobs and schedule them. There is an administrative team who is responsible for maintaining the system landscape and providing training for Tidal, along with standards, guidance, guidelines, and jobs. We use the solution for cross-platform, cross-application workloads within SAP. Within SAP, we might run a job on one system but wait for jobs on other systems to finish first; that is our interdependency between SAP systems. However, we don't do things like run something on SAP and then go do something on a non-SAP system. We may have a bit of that, but it's not a big part of what we do. It's mostly within SAP systems, or within a single SAP system.
For most of the companies where I have put Tidal in, it runs everything. It does back office, handles trading, reports time, does a lot of file transfers between vendors and regulatory bodies, etc. We use it to do a whole variety of things. File transfers are our most valuable use cases because those are the ones where we tend to have service level agreements and potential fines. Right now, we are just in a traditional installation with local servers. We use the solution with Hadoop and Workday but are not using the adapters for them.
We primarily use it for scheduling our JD Edwards ERP software batch jobs. The solution runs on Windows. It also integrates with our Unix & AIX systems. We use it for automating EDI transactions, so it reaches out to FTP sites as well.
We're running jobs on a global scale. Being a global company, we're running scheduled jobs and ad hoc jobs across different regions. Jobs cover backend processing, financials, and the like. We're running on an SAP ERP system and we're also running Informatica for data warehouse. We're running BusinessObjects web reports as well as a lot of straight Windows and Unix command-line things. We run FTP processing, PGP encryption processing, and data services jobs. We're running about seven or eight of the different adapter types that Tidal has available. We have it on-prem. Both our test and production environments are on fault-tolerant setups.
We are mainly using it for triggering data jobs. It does a lot of ETL work and data movement from systems into Hadoop. We use it because it has the capability of dependency triggering or dependency-based running; that's the main idea behind it. It also helps us to centralize and organize jobs across the organization. We use Tidal to run our Hadoop backup system, SAP HANA, and SAP BusinessObjects. We also trigger a lot of jobs in SnapLogic, Salesforce, ServiceNow, Workday, and Tableau, along with a couple of dashboards. We run a couple of batches from our Unix and Windows machines: the stuff that the developers are working on and want to run as ETL. But SAP is the main thing. The main goal is to use Tidal for managing and monitoring cross-platform, cross-application workloads. The ability to manage those loads is what it does well. I can put a job to run in SAP, and once the job ends successfully, I can run a job in Hadoop or in Salesforce.