ActiveBatch helped us streamline our IT workflows and improve overall efficiency. It is now the most essential tool in our infrastructure. The best part of ActiveBatch is its user-friendly interface, which feels easy to use even for someone with limited technical experience. We can configure and design complex workflows, and it also offers pre-built templates and wizards for quickly and easily creating automated workflows. ActiveBatch also supports on-premises, cloud, and hybrid deployments.
As sales operations analysts, our main tasks are handling cumbersome data, forecasting, and sharing the cleaned data with our global partners. We clean the data, store it in consumable Excel files, and upload those to SQL servers, which are in turn connected to visualization tools; we regularly refresh those tools to publish our dashboards to the service. ActiveBatch has streamlined all of these steps with automation and no manual intervention, which has helped decrease errors.
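To illustrate the kind of step such a job automates, here is a minimal, hypothetical Python sketch of the clean-and-upload stage. The table and column names are invented, and sqlite3 stands in for the real SQL Server connection; a real ActiveBatch job would invoke an equivalent script against the actual database.

```python
import csv
import io
import sqlite3

def clean_rows(raw_csv: str) -> list:
    """Drop blank rows and strip stray whitespace -- a stand-in for the
    manual cleaning step the automated job replaces."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        if not any(v.strip() for v in row.values()):
            continue  # skip fully blank rows
        rows.append({k: v.strip() for k, v in row.items()})
    return rows

def load_to_sql(rows: list, conn: sqlite3.Connection) -> int:
    """Upload cleaned rows into the table the dashboards read from,
    then return the resulting row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS forecast (region TEXT, units INTEGER)")
    conn.executemany(
        "INSERT INTO forecast (region, units) VALUES (:region, :units)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM forecast").fetchone()[0]
```

Chained together as a scheduled job step, this removes the manual copy-paste work the review describes.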
Cyber Security Analyst at Tata Consultancy Service
Real User
Top 5
2023-08-04T12:02:00Z
Aug 4, 2023
We have a security project where we need to perform daily scans on a number of our servers and network infrastructure components and keep a check on their health and status. We have implemented ActiveBatch to perform endpoint security scans across our environment for every component and provide a detailed report on each component's health, as well as flag the servers and components that need upgrades. We have scheduled the scans to run every 12 hours and provide the major stakeholders with detailed reports.
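A fixed 12-hour cadence like this reduces to a simple next-run calculation. The sketch below is not ActiveBatch's actual trigger logic, just an illustration of the idea, including catching up cleanly after downtime:

```python
from datetime import datetime, timedelta

SCAN_INTERVAL = timedelta(hours=12)

def next_scan(last_scan: datetime, now: datetime) -> datetime:
    """Return the next scan slot on a fixed 12-hour cadence, skipping
    any slots already in the past (e.g. after an outage)."""
    nxt = last_scan + SCAN_INTERVAL
    while nxt <= now:
        nxt += SCAN_INTERVAL
    return nxt
```

Skipping past-due slots instead of firing them all at once avoids a burst of overlapping scans after an outage.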
We used the solution extensively in material planning, material transfer, and SCM activities (such as outsourcing, purchasing, OEM reworks, production planning, manufacturing BOMs, work order closures, calculating and identifying the SLE of material, scrap stores, customer and supplier tracking, finance, invoice billing, and securely managing data processing and data transfer). The main issue we encountered was that we, as users, could not customize the software as needed, since different organizations have different working cultures and different aerospace standards to maintain. Therefore, for any new improvements, we had to contact the service engineer and discuss the requirements. Apart from this, the other functions were fantastic, requiring only a little software training to understand the purpose of each one.
As a QA engineer, monitoring logs in the production environment was a tedious task. It was time-consuming and required a lot of manual effort. Using ActiveBatch Workload Automation minimized the downtime and maximized productivity. Job scheduling is another major advantage of this tool. At the time, there were nightly batch jobs like Trigger Service, User Service, Notifications Service, and many more, which were easily handled by ActiveBatch and made our job simple and effortless. We currently have 25 jobs running on this platform across different environments.
ActiveBatch is used for multiple purposes, including SAP jobs, file transfer systems (FTP), and data warehouse loads. ActiveBatch has numerous functionalities that support different types of workflows, including batch calls and service calls. The format of job scheduling is well organized and very similar to what we use in everyday life, which makes it easy to use. It is able to control jobs for multiple environments across different servers, and we can set up automated remediation and alerts for the operations we have created.
DevOps Engineer at HTC Global Services (INDIA) Private
Real User
Top 5
2023-04-09T12:26:00Z
Apr 9, 2023
The primary use case is monitoring the servers, along with alerts and logs. Earlier, I had to do a lot of this work manually, and it was very time-consuming. Since ActiveBatch Workload Automation was implemented, everything is smooth. We also use it for job scheduling and server maintenance, which has been good. More than anything, when managing workload balance across multiple platforms, this solution saves me from switching between platforms to keep an eye on them, as the tool takes care of it automatically. Apart from this, the API integrations have been very helpful as well.
We have connected this automation tool to our SAP ECC system, and all the ECC batch jobs are scheduled via this tool. We configured the alerting mechanism so that whenever a job fails or runs long, we get an immediate email notification, which helps in monitoring the jobs. The best part of the tool is the submit frame and time window, where we can schedule jobs as per the customer's requirements. As we have a quality environment, we connected the tool to ECC QA, so before making any changes to production we can test new requirements in quality and then move them to production.
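The fail/long-running alerting described here can be sketched as a small helper. The address and 60-minute threshold below are invented for illustration; actual delivery would hand the message to an SMTP relay via smtplib.

```python
from email.message import EmailMessage
from typing import Optional

def job_alert(job_name: str, status: str, runtime_min: float,
              long_running_min: float = 60.0) -> Optional[EmailMessage]:
    """Compose an alert email when a job fails or exceeds its runtime
    threshold; return None when no alert is needed."""
    if status == "FAILED":
        reason = "failed"
    elif runtime_min > long_running_min:
        reason = f"has been running for {runtime_min:.0f} minutes"
    else:
        return None
    msg = EmailMessage()
    msg["Subject"] = f"[ActiveBatch] Job {job_name} {reason}"
    msg["To"] = "sap-ops@example.com"  # hypothetical distribution list
    msg.set_content(f"Job {job_name} {reason}. Please investigate.")
    return msg  # a real job step would pass this to smtplib.SMTP
```

Returning None for healthy jobs keeps the monitoring inbox limited to actionable notifications.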
Task scheduling was manual earlier, and it was very difficult and time-consuming to schedule jobs. Even then, it was hard to find out what was wrong if a job or task failed. With ActiveBatch Workload Automation, the entire workflow has been automated, and we can schedule our whole process from a single point of control with features like real-time monitoring. We're streamlining IT operations, getting centralized security, and gaining quick, reliable integration features that are game changers in the industry.
Senior Data Engineer at an insurance company with 501-1,000 employees
Real User
2022-06-14T12:51:00Z
Jun 14, 2022
We used this solution for automating our batch processing, which includes data loads, refreshing data marts, and refreshing Power BI reports. We were using it for end-to-end automation, primarily for data delivery to reporting, across every end-to-end process. The company is a customer of ActiveBatch.
Manager at a financial services firm with 501-1,000 employees
Real User
2021-11-23T21:41:10Z
Nov 23, 2021
ActiveBatch Workload Automation is a standard scheduling tool of the kind you find on the market. The ultimate goal is to run everything through ActiveBatch Workload Automation, but we are constantly working to migrate off our legacy processes, which always takes a lot of time and effort. However, we are focused on implementing all of the new processes through ActiveBatch Workload Automation.
Production Control Manager at a tech services company with 51-200 employees
Real User
2021-02-07T11:28:00Z
Feb 7, 2021
We provide parking enforcement support for cities around the USA. If you are a municipality, you may have a contract with us, and we would provide services ranging from parking enforcement to tollway enforcement. It really depends on the end user and what the community's business is.

All of our automation runs through ActiveBatch. We have probably close to 2,500 jobs running each day that provide support for different municipalities around the US. All of our clients' data comes to us via a scheduled set of file movements arranged within ActiveBatch. At midnight, every night, we get every ticket that a municipality issued in the last 24 hours and put it into our database so the municipality can ensure the money is collected within a reasonable length of time. Each community has its own set of required rules that have to be followed, e.g., how long a delay is allowed between a citizen's parking fine going unpaid and the next attempt to check whether they have paid it.

It is deployed via our own internal network connections; it is a locally-sourced platform for us. We don't have a lot of really complex job flows. That just isn't the nature of our business, because you can't really take a municipality's data someplace else. However, our data is shared between a data center in Wisconsin and a data center in Indiana, so our data is in both locations every day.
BI Data Integration Developer - EIM at a healthcare company with 10,001+ employees
Real User
2021-01-29T02:38:00Z
Jan 29, 2021
Primarily, we've been using it in a localized way, but it's becoming more and more of an enterprise tool as the knowledge is shared throughout the team and department. But primarily it has been used for ETL-type work. My team is data integration and we use it to schedule our Informatica PowerCenter workflows as well as DataStage. We also use it for a lot of file transfers, such as SFTP stuff. And we've recently explored some API calls that we can use to interface with Qlik.
It does a little bit of everything. We have everything from console apps that our developers create to custom jobs built directly in ActiveBatch, which go through the process of moving data off of cloud servers, like SFTP, onto our on-premise servers so we can ingest them into other workflows, console apps, or whatever the business needs.
Senior System Analyst at an insurance company with 5,001-10,000 employees
Real User
2020-11-05T06:53:00Z
Nov 5, 2020
We have roughly 8,000 jobs that run every day, and they manage anything from SaaS to Python to PowerShell to batch, Cognos, and Tableau. We run a lot of plans that involve a lot of constraints requiring them to look at other jobs that have to run before they do. Some of these plans are fairly complicated and others are reasonably simple.

We also pull information from SharePoint and load that data into Greenplum, which is our main database. SharePoint provides the CSV file and we then move it across to Linux, which is where our main agent is that actually loads into the Greenplum environment. Source systems acquire data that goes into Greenplum. There are a number of materialized views that get populated, and that populating is done through ActiveBatch. ActiveBatch then triggers the Tableau refresh so that the reports that pull from those tables in Greenplum are updated. That means from just a bit after source acquisition, through to the Tableau end report, ActiveBatch is quite involved in the process of moving data.

We have 19 agents if you include the Linux environment, and 23 if you count the dev environments. It's huge. It's on-prem. We manage the agents and the scheduler on a combination of Windows and Linux.
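The constraint chains described here are essentially a dependency graph. A minimal sketch with Python's stdlib `graphlib` shows how such an ordering can be resolved; the job names are invented to mirror the SharePoint-to-Tableau pipeline above, and this is an illustration of the pattern, not ActiveBatch's internal scheduler.

```python
from graphlib import TopologicalSorter

# Hypothetical plan: each job maps to the set of jobs that must run first.
PLAN = {
    "move_to_linux": {"sharepoint_extract"},
    "greenplum_load": {"move_to_linux"},
    "refresh_views": {"greenplum_load"},
    "tableau_refresh": {"refresh_views"},
}

def run_order(plan: dict) -> list:
    """Return one execution order that honors every 'runs-before'
    constraint, the way a scheduler resolves plan dependencies."""
    return list(TopologicalSorter(plan).static_order())
```

`TopologicalSorter` also raises `CycleError` on circular constraints, which is exactly the misconfiguration a scheduler has to reject.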
Systems Architect at an insurance company with 201-500 employees
Real User
2020-11-04T07:28:00Z
Nov 4, 2020
We are using ActiveBatch to automate as many of our processes as we can, limiting the amount of time operators spend running recurring jobs. Included in that is about 99.5 percent of our nightly cycles. We call a mixture of executables: SSIS jobs, SQL queries, and PowerShell scripts. We also call processes in both PeopleSoft and another third-party software package.
I am the administrator handling all of the ActiveBatch-related activities. It is used for all of our processes, scheduling, and basically all of the automation.
Senior Operations Administrator at Illinois Mutual Life Insurance Company
Real User
2020-04-02T07:00:00Z
Apr 2, 2020
ActiveBatch is used for scheduling our nightly batch processes; that is our main use at this point. It includes billing, processing, claims, commission statements, and a lot of reporting, all tied into that batch process. We use the built-in REST call process for nightly printing coming out of that batch cycle, and we distribute the nightly reports out of the batch cycle to different departments using ActiveBatch. It's also used for FTP processing every week coming out of the weekly commissions process. The most important part for us is keeping those nightly batch cycles in an easy-to-read format, which is where ActiveBatch Plans come into play. We run these cycles in four different environments, from development to production with a couple of stops in between, and keeping all of those jobs separate from one another is key for us.

Outside of batch, we run a process every five minutes throughout the day during business hours to scrape data from our mainframe entry system to our new policy administration system. As people enter claims into the mainframe system, those claims get moved over within five minutes, rather than waiting for the mainframe batch cycle to run that night and the claims not being seen until the next day. That saves us up to 24 hours; the business end-users can get that data within five minutes now.
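The five-minute claims sync is a classic watermark pattern: each run copies only what arrived since the previous run. A hedged Python sketch (the `id` field is invented; a real job would key on whatever the mainframe exposes):

```python
def incremental_sync(source_rows, last_seen_id: int):
    """Return the claims entered since the previous run plus the new
    watermark, so each five-minute cycle moves only fresh records."""
    new_rows = [r for r in source_rows if r["id"] > last_seen_id]
    watermark = max((r["id"] for r in new_rows), default=last_seen_id)
    return new_rows, watermark
```

Persisting the watermark between runs is what lets a five-minute cadence stay cheap: a run with nothing new moves nothing.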
Most of the jobs are for the automation of processes, but we also use it for IT operations, including monitoring. We execute over 20,000 jobs daily. It's moving data files and doing a lot of calculations in hydrology and the like. The business users are maintaining their own jobs, setting them up, configuring, and maintaining them. They only contact us, in IT, if there are any problems. ActiveBatch is completely on-prem but the rest of our organization has many different kinds of infrastructure and locations, both in the cloud and in 16 countries. We have about 4,000 employees.
Data Warehouse Operations Analyst at a leisure / travel company with 1,001-5,000 employees
Real User
2020-03-31T06:37:00Z
Mar 31, 2020
We use ActiveBatch to run the data warehouse production batch schedule, which is 24/7. We run, on average, about 200 distinct workflows each day to update the warehouse. And once the warehouse tables are loaded, we trigger our business intelligence reports and our analytics reports. We also use ActiveBatch to run a software tool called iCEDQ for data quality, as well as some Alteryx jobs. Our production servers are in a co-location, and the solution is deployed onsite there.
Senior IT Architect at a pharma/biotech company with 5,001-10,000 employees
Real User
2020-03-31T06:37:00Z
Mar 31, 2020
We use it for a variety of different tasks, most of which are related to data management, such as scheduling, processes related to updating business intelligence reporting, or general data management work. It's also used for some low-level file transfers and merges in some cases. We use the solution for execution on hybrid machines, across on-prem and cloud systems. We have code that is executed in a cloud environment and on various Windows and Unix servers. We are on version 11, moving to version 12 later this year.
Client Service Manager/Programmer at a tech vendor with 51-200 employees
Real User
2020-03-29T08:26:00Z
Mar 29, 2020
In our company, we deal with a lot of data processing. Clients send us extract files that we load into our system so that we can run calculations, and all of that is orchestrated using ActiveBatch automation. To summarize: we have software that we use to calculate values, but we need to receive the files from the client, get them to the right spot, and get them ready for processing. All of those steps are done using the automation tool. The integrations we mainly use it with are FTP and SQL, and we use a batch file or a script file to call our internal programs. It does have the ability to call PowerShell scripts, and we do use some of those; we just don't need a lot of PowerShell because most of our software is written in a different language.
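The receive-and-route step can be sketched with stdlib `pathlib`/`shutil`. The patterns and folder names below are invented; a real job would encode each client's actual file layout and naming convention.

```python
import shutil
from pathlib import Path

# Hypothetical routing rules: inbound filename pattern -> destination subfolder.
ROUTES = {"*.csv": "extracts", "*.ctl": "control"}

def route_files(inbox: Path, workdir: Path) -> list:
    """Move inbound client files from the drop folder into the directory
    each downstream calculation expects; return the moved paths."""
    moved = []
    for pattern, sub in ROUTES.items():
        dest = workdir / sub
        dest.mkdir(parents=True, exist_ok=True)
        for f in inbox.glob(pattern):
            moved.append(Path(shutil.move(str(f), str(dest / f.name))))
    return moved
```

Files that match no rule stay in the inbox, so unexpected deliveries surface for a human instead of being silently processed.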
Supervisor IT Operations at an insurance company with 501-1,000 employees
Real User
2020-03-29T08:26:00Z
Mar 29, 2020
ActiveBatch controls just about everything in our organization. We do server monitoring with our EDI feeds being inbound and outbound. We do Oracle processing with it. It is very comprehensive for what we do and a central point of everything in our organization at this point.
ActiveBatch by Redwood automates and manages batch processes, data integration tasks, and workflow scheduling. It's used for file transfers, data processing, server monitoring, and report generation, supporting both on-prem and cloud environments.
Organizations implement ActiveBatch by Redwood to automate complex job scheduling and data workflows, integrating seamlessly with FTP, SQL, PowerShell, and other systems. With features like real-time monitoring, error handling, and centralized...
We use ActiveBatch Workload Automation primarily for managing work schedules.