SEEBURGER, along with four or five other big vendors, focuses on the integration space. When you talk about data integration, there are two major aspects: the transactional messaging side and the batch file-based side. My team is focused on the batch file-based side of things. A completely different team, with a different set of software, handles the transactional messaging aspects. We are using it for all secure file transfer use cases throughout the organization with multiple different data patterns: moving data within the company, moving data to and from outside the company, and ad hoc file transfer. Any type of file-based secure connectivity goes through our team using this product.
Currently, it is on-prem. Like most companies, we have a cloud initiative, which has been underway for a couple of years now. It is on our radar for later this year: we are going to spin up another project to consider either moving to our own AWS environment or to SEEBURGER's AWS-hosted iPaaS environment.
No matter how much you automate in the file transfer space, there is always more to be uncovered in a big company. Users, especially on the business side, will start doing their own thing and interacting with some external webpage to upload or download files manually every day. They incorporate it into their daily tasks. When we discovered that, we were like, "Why are you wasting all that time? What if you are out?" That has pushed us to start consuming different website APIs to push and pull files to and from various vendor websites, work that was historically done by users manually (a sketch of that pattern follows below). So, there is an automation aspect to it. Beyond that, application-to-application connectivity, which historically went over protocols like SFTP as batch files, is converting over to APIs now because everybody wants to be faster and use APIs. Therefore, a lot of these application-to-application data flows have changed over to APIs in the last year. For example, in past years I used to get maybe one API request a year. This year, I get one or two new ones every week or two. APIs are just taking over.
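To make that concrete, here is a minimal sketch of what replacing one of those manual daily downloads with an automated API pull might look like. The endpoint, token, and landing directory are hypothetical placeholders, not a real vendor API; in practice this logic would run inside a SEEBURGER BIS workflow rather than as a standalone script.

```python
# Hedged sketch: automating a daily file pull that a user used to do by hand.
# VENDOR_URL, API_TOKEN, and PICKUP_DIR are hypothetical, not a real vendor API.
import pathlib

import requests

VENDOR_URL = "https://vendor.example.com/api/v1/files/daily-report"  # hypothetical
API_TOKEN = "..."  # would come from a credential store in practice
PICKUP_DIR = pathlib.Path("/data/inbound/vendor")  # hypothetical landing zone


def pull_daily_file() -> pathlib.Path:
    """Download today's file once, instead of a user doing it manually."""
    resp = requests.get(
        VENDOR_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=60,
    )
    resp.raise_for_status()  # fail loudly so monitoring can alert
    PICKUP_DIR.mkdir(parents=True, exist_ok=True)
    target = PICKUP_DIR / "daily-report.csv"
    target.write_bytes(resp.content)  # downstream workflow picks it up from here
    return target


if __name__ == "__main__":
    print(f"Landed file at {pull_daily_file()}")
```

Scheduled to run early each morning, a pull like this removes the dependency on any one user being at their desk.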
It is good anytime you can take a user who is doing something manually every day out of the picture and automate the process, e.g., going to a vendor's webpage to pull or push a file every day. Although there is a one-time cost to do the development work, you reap the ongoing benefits because that user no longer has to spend time doing it every day. You don't have to worry about whether that user is out or gone for a week on vacation. Things just happen automatically. There is definitely a benefit.
Because we are in the financial services industry, PCI is huge; you have to comply with PCI regulations. That primarily has to do with credit card numbers, but really any account number or sensitive data. What is nice about SEEBURGER BIS is that it has made it easy to pass our yearly PCI audits with this thing called the PCI realm. You can configure any of your data flows that need it into that PCI realm, and it automatically complies with the regulations, offloading the data as soon as it flows through the system. It doesn't store any copies. It offloads the data in an encrypted state to our PCI zone, where it is stored for X days during our backup period. That is all out-of-the-box functionality, so you don't have to waste your time trying to figure out how to comply with the PCI rules, because it is already built into the software. You just have to configure your flows into that PCI realm.
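For illustration only, the general encrypt-and-offload pattern that the PCI realm automates looks roughly like the sketch below. This is not SEEBURGER's internal implementation or configuration; the path, function name, and key handling are hypothetical, and BIS does all of this for you once a flow is configured into the realm.

```python
# Illustrative sketch of encrypt-then-offload, the pattern the PCI realm
# automates out of the box. PCI_ZONE and the key handling are hypothetical.
import pathlib

from cryptography.fernet import Fernet  # key created via Fernet.generate_key()

PCI_ZONE = pathlib.Path("/pci-zone/archive")  # hypothetical segregated storage


def offload_encrypted(source: pathlib.Path, key: bytes) -> pathlib.Path:
    """Encrypt the payload, move it to the PCI zone, keep no plaintext copy."""
    ciphertext = Fernet(key).encrypt(source.read_bytes())
    PCI_ZONE.mkdir(parents=True, exist_ok=True)
    target = PCI_ZONE / (source.name + ".enc")
    target.write_bytes(ciphertext)
    source.unlink()  # no plaintext copy remains outside the PCI zone
    return target
```

The point of the out-of-the-box realm is that you never have to write or maintain code like this yourself; retention for X days and cleanup are handled by the product.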
Now that APIs are coming into the picture, we are getting use cases that cut across the two major areas data integration has historically been broken into at our company: transactional messaging on one side and batch file-based transfers on the other. You are starting to see those two areas merge, because usually when we get a batch file over an API, they want us to break that batch file up into individual transactions, iterate over all the records in it, and post them as transactions into our messaging system.
So there is this interesting convergence between the messaging space and the batch file-based space, driven by APIs. This is another area where I am seeing a lot of requests lately: "Hey, I want you to still get a batch file like you always have in the past, but it is going to come over an API now instead of over something like SFTP. Then I want you to iterate over every record in that batch file and post those transactions individually." This is another big growth area right now, and I am working on solutions to support it (a sketch of the record-by-record posting pattern follows below). It is an area of growth because we will be using these batch files to post into more internal systems, and to do it more flexibly as individual transactions instead of as one big batch file. This is an area we are looking to grow in over the next year.
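Here is a hedged sketch of that batch-to-transactions pattern: a batch file (a CSV in this example) has already landed on disk, and each record is posted individually to an internal messaging endpoint. The endpoint URL, file path, and function name are hypothetical, and the production version would be a BIS workflow with proper retry and error handling rather than a script.

```python
# Hedged sketch: turn one batch file into N individual transaction posts.
# MESSAGING_URL and the batch file path are hypothetical placeholders.
import csv
import pathlib

import requests

MESSAGING_URL = "https://messaging.internal.example.com/api/transactions"  # hypothetical


def post_batch_as_transactions(batch_file: pathlib.Path) -> int:
    """Iterate over every record in the batch file and post each one."""
    posted = 0
    with batch_file.open(newline="") as fh:
        for record in csv.DictReader(fh):
            resp = requests.post(MESSAGING_URL, json=record, timeout=30)
            resp.raise_for_status()  # a failed record should surface, not vanish
            posted += 1
    return posted


if __name__ == "__main__":
    count = post_batch_as_transactions(pathlib.Path("/data/inbound/batch.csv"))
    print(f"Posted {count} individual transactions")
```

The design choice here is per-record posting so that one bad record can be caught and handled without discarding the whole batch.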
You could make the argument for time to market. If there was a user doing something manually every day, then we came along with SEEBURGER BIS and automated that. Instead of waiting around for the user to load the file once a day at whatever time they got to it, we had the system automatically pull the file, possibly early in the morning, and pass it immediately into the back-end processing system. There are some cases like that, where historically it took eight hours because a user had to get engaged, transform the data manually, and then load it manually. We come along and run a workflow in two seconds that streams it right through. Therefore, you could make the argument that we sped up time to market by automating previously manual processing. However, as far as general B2B file transfer or application-to-application connectivity goes, I don't know that you could claim the software itself has sped up time to market, other than coding workflows a bit more efficiently to shave off seconds or minutes and make them run faster.