Senior Technology Architect at a tech vendor with 10,001+ employees
Real User
Enables continuous testing and integration by generating the required data in advance
Pros and Cons
  • "The combination of extract, mask, and load, along with synthetic data, is what is generally needed by any of our clients, and CA TDM has really good compatibility in both of the areas."
  • "CA is one of the few tool suites that has end-to-end features. Whatever role you are playing, whatever personality you are trying to address, it has that feature. For example, CA Service Virtualization goes hand-in-hand with TDM."
  • "It has a feature called TDM Portal, where testers can find test data by themselves, based on multiple models. They can reserve the data so that it belongs to one group or individual. Obviously, that data is not available to anybody else... This feature is for one environment. But if a different group of testers wanted that data for a different environment, they can't use it via CA TDM. That feature doesn't exist."

What is our primary use case?

TDM is something people do all the time; it is never something you do from scratch. For every client there is a different scenario, and there are a lot of use cases, but a couple of them are common everywhere. One of them is creating data that does not exist in production. How do you create that data? Synthetic data creation is one use-case challenge that is common across the board.

In addition, the people who do the testing are not very conversant with the back end or with the different types of databases, mainframes, etc. And most of the time they don't write very good SQL to be able to find the data they are going to do their testing with. So data mining is a major concern in most places. 

The use cases are diverse. You cannot point to many common things and say that this will work or this will not. Every place, even though it's a TDM scenario, is different. Some places have very good documentation, so you can directly start with extraction, masking, and loading. But for most places that is not possible because the documentation is not there. There are multiple use cases. You cannot say that one size fits all.

In the testing cycle, when there is a need for test data management tools, we use CA TDM to provide the data feed.

How has it helped my organization?

If you take DevOps as an example, suppose development has happened and the binary code has been deployed to a certain server. To do continuous testing or continuous integration, you need the test data. CA TDM has a feature where it can generate the required data beforehand and keep it with your test cases however you need it. If you are using JIRA it will put the test data in JIRA. If you are using ALM it will give data to HPE ALM. Once you are running your test cases in an automated way, the data is already there. 
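To make that concrete, here is a rough sketch of the pattern described, not CA TDM's actual integration: generate a small synthetic data set ahead of a run and attach it to a JIRA issue so the data is already waiting when the automated tests execute. The JIRA URL, credentials, issue key, and field names are hypothetical placeholders.

```python
# Illustrative only: CA TDM's publish engine handles the JIRA/ALM hand-off
# itself; this hand-rolled script just shows the shape of the workflow.
import csv
import io
import random

import requests

def generate_customers(n):
    """Yield n synthetic customer rows (invented fields)."""
    for i in range(1, n + 1):
        yield {
            "customer_id": i,
            "name": f"Customer-{i:05d}",
            "credit_limit": random.choice([1000, 5000, 10000]),
        }

def to_csv(rows):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["customer_id", "name", "credit_limit"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def attach_to_jira(issue_key, csv_text):
    # JIRA's attachment endpoint requires the X-Atlassian-Token header.
    resp = requests.post(
        f"https://jira.example.com/rest/api/2/issue/{issue_key}/attachments",
        auth=("tdm-bot", "app-password"),          # hypothetical credentials
        headers={"X-Atlassian-Token": "no-check"},
        files={"file": ("test_data.csv", csv_text, "text/csv")},
    )
    resp.raise_for_status()

if __name__ == "__main__":
    attach_to_jira("PROJ-123", to_csv(generate_customers(50)))
```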

And the data provided will work for all of these purposes. For example, if you want to run a set of automated scenarios, it will work with that. If you want it to work with various regression cycles, it will work with that. And the same data, or a different set of data that you provide, will work with the unit cycles as well. CA has the ability to provide all the data on demand as well as on the fly.

What is most valuable?

The combination of extract, mask, and load, along with synthetic data, is what is generally needed by any of our clients, and CA TDM has really good compatibility in both of these areas.
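As a rough illustration of the extract-mask-load pattern (not CA TDM's masking engine itself), here is a minimal sketch using sqlite3 as a stand-in for the source and target databases; the table and column names are invented:

```python
import hashlib
import sqlite3

def mask_email(email: str) -> str:
    """Deterministically mask an email: the same input always yields the
    same output, so joins on the masked column still line up."""
    digest = hashlib.sha256(email.encode()).hexdigest()[:10]
    return f"user_{digest}@example.test"

def extract_mask_load(source, target):
    # Extract from the source, mask the sensitive column, load the target.
    for cust_id, name, email in source.execute("SELECT id, name, email FROM customers"):
        target.execute(
            "INSERT INTO customers (id, name, email) VALUES (?, ?, ?)",
            (cust_id, name, mask_email(email)),
        )
    target.commit()

if __name__ == "__main__":
    src, dst = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for db in (src, dst):
        db.execute("CREATE TABLE customers (id INTEGER, name TEXT, email TEXT)")
    src.execute("INSERT INTO customers VALUES (1, 'Jane', 'jane@real.example')")
    extract_mask_load(src, dst)
    print(dst.execute("SELECT * FROM customers").fetchall())
```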

CA is one of the few tool suites that has end-to-end features. Whatever role you are playing, whatever persona you are trying to address, it has that feature. For example, CA Service Virtualization goes hand-in-hand with TDM. In addition, TDM has automation. CA has most of the features that complement the whole testing cycle.

CA TDM has open APIs. Suppose we are using a set of Excel data as the feed, and we want to help Service Virtualization by providing a set of dynamic responses to the requests that the service layer is getting. How do we do that? We can use the API layer once the whole process stabilizes, after three months or so. In other tools, that takes longer. This open API capability is good in CA TDM.
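As a toy illustration of that idea, the sketch below serves dynamic responses from a spreadsheet feed (embedded here as CSV text for simplicity); in practice CA TDM's open REST APIs would feed CA Service Virtualization directly, and everything here, data included, is assumed:

```python
import csv
import io
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for the Excel feed: each row maps an account id to a response.
FEED = io.StringIO("account_id,balance\n12345,1750.00\n67890,12.50\n")
RESPONSES = {row["account_id"]: {"balance": row["balance"]}
             for row in csv.DictReader(FEED)}

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expect paths like /accounts/12345
        account_id = self.path.rstrip("/").split("/")[-1]
        found = account_id in RESPONSES
        body = json.dumps(RESPONSES.get(account_id, {"error": "not found"}))
        self.send_response(200 if found else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), StubHandler).serve_forever()
```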

What needs improvement?

There are multiple things which can be improved in CA TDM. It has a feature called TDM Portal, where testers can find test data by themselves, based on multiple models. They can reserve the data so that it belongs to one group or individual. Obviously, that data is not available to anybody else. Without this tool, if somebody goes through the back end via SQL and pulls that data, you can't do anything. But through the Portal, if somebody reserves the data, it's theirs. This feature is for one environment. But if a different group of testers wants that data for a different environment, they can't get it via CA TDM. That feature doesn't exist. You have to build a portal or you have to bridge the two environments. That is a big challenge.


For how long have I used the solution?

Three to five years.

What do I think about the scalability of the solution?

You can envision TDM happening at three layers. One is the application layer, the second is a cluster layer, and the third is an end-to-end layer. The data required for the first level and the second level are pretty different. You can't use first-level data in the second level. And the data required for the third, for end-to-end testing, is very different from the first two layers.

So when we look at scalability, we have to see how we are creating the "journey" from one layer to another. For example, if we are working in the customer area and then we jump to payments, we have to see what the common things are that we can scale and what areas we have not tested and address them.

How was the initial setup?

The initial setup is always complex. I have been working in testing environments for the last 10 or 11 years. From what I have seen, most companies lack the basic building blocks for testing.

Suppose I have a system X, and that system gives data to system Y, and Y gives data to system Z. Nobody has a clue how that data gets there for testing, because that end-to-end testing has never happened. We cannot give someone data which will be rejected by system Z. We have to give him data which will pass across all the systems. And that means we have to understand the mapping file behind it. However, the mapping file is often not there, so we have to create it.
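A simplified sketch of what such a mapping file buys you, with invented systems and rules: a candidate test record is accepted only if it would survive every hop from X to Z.

```python
# Each downstream system declares the fields it requires and the
# constraints it enforces; the record must pass them all.
MAPPING = {
    "system_X": {"required": ["order_id", "amount"]},
    "system_Y": {"required": ["order_id", "currency"],
                 "constraints": {"currency": {"USD", "EUR", "TRY"}}},
    "system_Z": {"required": ["order_id", "amount", "currency"],
                 "constraints": {"amount": lambda v: v > 0}},
}

def passes_all_systems(record: dict) -> bool:
    for system, spec in MAPPING.items():
        for field in spec["required"]:
            if field not in record:
                print(f"{system}: missing {field}")
                return False
        for field, rule in spec.get("constraints", {}).items():
            ok = rule(record[field]) if callable(rule) else record[field] in rule
            if not ok:
                print(f"{system}: rejects {field}={record[field]!r}")
                return False
    return True

print(passes_all_systems({"order_id": 1, "amount": 250, "currency": "TRY"}))  # True
```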

We have to talk about the various models: are they logical or physical? Somebody may have created a set of logical data models 20 years back, but it is not usable now. We have to work with the tool to create that set of data.

We also have to consider the schema of values. If it's IMS, that is different from an RDBMS. We have to find out which segment has more data, which segment is complete, and which segment is giving data to other systems. When we talk to the people who are working on that data set, one that is 20 or 30 years old, 90 percent of the time they don't have a clue. They are working with various tools but they don't have a clue how it is happening.

So there are always multiple challenges at the start. But then we do due diligence for six or eight weeks and it clears up all the cobwebs: what is there, what is not there, and the roadmap. That puts a foot forward so we can say, "Okay, this is how we should move and this is what we should be able to achieve in a given timeline."

The initial deployment will take a minimum of three to four weeks.

The second step is a PoC or a pilot to run with a set of use cases.

Which other solutions did I evaluate?

Apart from CA TDM, I've used IBM InfoSphere Optim, which was the number-one TDM tool for quite some time, and I've used Delphix. Now, a couple more tools have come onto the market, like K2View. At one point in time, about two years back, CA TDM was the only tool that could do synthetic data.

CA TDM and Optim have different ways of working. Compared to Optim, CA TDM's major advantage is synthetic data creation. No other tool was able to do that. Only in the last two years has IBM Optim come up with synthetic data capabilities, but what it is doing is creating a superset: if you have sample data, it will create a superset of that data. That is not how CA, or the other tools, work.

There are multiple sites that also create synthetic data, but the major challenge comes into play once you need to put that data back into the database.

What other advice do I have?

There are, let's say, five market-standard tools you can choose from. If you choose CA TDM, you need to bring out all your questions for your PoC journey. You have four weeks to get answers to whatever questions you have. There is a set of experts at CA and partners have expertise as well. Both will be able to answer your questions.

Next, you need to supply a roadmap. For example, "I need X, Y, and Z to be tackled first." And the roadmap that comes out of the due diligence needs to be followed word-for-word. So proper planning is essential.

There are three teams at the base of your TDM journey. One team is a central data command team, one is a federated team, and the third is for creating the small tools that you might require at a given point in time. To start, you need three to four people. But we have gotten into all types of data: Big Data, RPA, performance, etc. Wherever data is needed, our team is providing the data. In a bank, for example, where I did two rounds of due diligence, one lasting eight weeks and the other, three years later, lasting six weeks, we even implemented bots. When we started there the team was 50. Even though we automated the whole thing, more than anyone might have imagined, the team is still 40-plus.

Disclosure: My company has a business relationship with this vendor other than being a customer: Preferred Partner.
PeerSpot user
Practice Head - Digital Testing at a tech services company with 10,001+ employees
Real User
Synthetic data generation enables us to create multiple copies of similar data, but the UI needs improvement
Pros and Cons
  • "The synthetic data generation is really good... You can write rules and create permutations and combinations according to your needs. Or you can take a snippet of the Prod data and replicate it."
  • "The integration with various utilities is also really important. That still has to happen. That's a major area for improvement."

What is our primary use case?

We use it for enterprise-level solutions.

How has it helped my organization?

While we are testing, when there is data that's not accessible or we need to quickly generate data, TDM comes in handy. We can create batch files as well. We can write scripts which automatically create data and we can integrate it with the automatic Dev scripts. This feature is very good. We have used these kinds of features for smaller solutions, although not at a very large scale, because of the complexities involved in the enterprise-level data.
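As a small illustration of that scripted approach (field names and defaults are invented), a command-line generator like the following could be called by the Dev scripts before each run to create a fresh batch file of test data:

```python
import argparse
import csv
import random

def main():
    parser = argparse.ArgumentParser(description="Create a batch of test accounts")
    parser.add_argument("--rows", type=int, default=100)
    parser.add_argument("--out", default="accounts_batch.csv")
    args = parser.parse_args()

    with open(args.out, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["account_id", "type", "balance"])
        for i in range(args.rows):
            writer.writerow([10000 + i,
                             random.choice(["SAVINGS", "CHECKING"]),
                             round(random.uniform(0, 9999), 2)])
    print(f"wrote {args.rows} rows to {args.out}")

if __name__ == "__main__":
    main()
```

A CI step might then invoke something like `python make_batch.py --rows 500` ahead of the regression suite, which is the kind of integration with Dev scripts described above.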

What is most valuable?

The entire tool is good and I like the synthetic data generation, that's really good. It's valuable because you don't have Prod data so, instead, you can create multiple copies of similar data. You can write rules and create permutations and combinations according to your needs. Or you can take a snippet of the Prod data and replicate it. All of that is really helpful.
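A minimal sketch of the rules-plus-combinations idea, with invented fields and a made-up business rule: enumerate the cross product of legal values per field, then filter with the rules.

```python
from itertools import product

FIELDS = {
    "customer_type": ["RETAIL", "CORPORATE"],
    "account_state": ["ACTIVE", "DORMANT", "CLOSED"],
    "balance":       [0, 100, -100],
}

def business_rules(row: dict) -> bool:
    # Example rule: closed accounts must have a zero balance.
    if row["account_state"] == "CLOSED" and row["balance"] != 0:
        return False
    return True

combos = [dict(zip(FIELDS, values)) for values in product(*FIELDS.values())]
valid = [row for row in combos if business_rules(row)]
print(f"{len(valid)} of {len(combos)} combinations pass the rules")
```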

What needs improvement?

The UI could be improved, and I see they are going web-based. That's still in progress, but I really hope all of that happens pretty soon and the entire UI gets migrated from the desktop to the web.

The integration with various utilities is also really important. That still has to happen. That's a major area for improvement.

For how long have I used the solution?

Three to five years.

What do I think about the stability of the solution?

It has become pretty stable over the past couple of years. When it started, it had issues, but right now I don't think there are any major issues.

What do I think about the scalability of the solution?

It's a tool, so scalability depends on how you use it. Scalability is pretty relative. It provides a lot of features and it's up to you how you utilize them. It's pretty scalable. It has automated features and I don't think there is any other tool on the market which provides such a level of automated solutions. The demand in the industry, with respect to enterprise solutions, is pretty complex and CA TDM is pretty good. It is scalable, but not to the extent that a foolproof enterprise solution can be provided using this tool.

How are customer service and technical support?

Support is pretty good. We get answers to problems most of the time and, if we don't, they get in touch with the tech team and we get on a call with them and we figure it out together.

How was the initial setup?

The setup is of medium complexity. It's been a long time since I set it up. I have had it on my laptop for a long time, but this is what I remember. The configuration does not happen by clicking a button so that you can start using it; it has its own steps. You register the repository, etc., to get into the tool. The installation itself is fine, but configuring it and getting it ready to use could be better.

The time it takes depends. At times I have installed it in a couple of hours, but if I get stuck... I don't remember all the issues I have faced, it's been a while, but I do remember that I had issues.

Every project and every implementation have to have a strategy. There are a few basic things that we look for and we follow a checklist to see if the project is feasible for TDM or model-based testing or some other solution. As far as implementation strategies are concerned, they are very specific to the client and the kind of ecosystem the client has. The basic strategy would be to not go "big-bang," to start with the basic and medium-complexity tests to show the ROI, and then roll it out one-by-one across the enterprise. But there can be a lot of nuances in the strategy document.

In terms of the number of staff needed for deployment, to start with we would not need more than two people to perform the PoC and do due diligence on the requirements. We would need two to three people in a bigger organization and one person for a smaller solution. It depends on the requirements and on how much work is involved. To maintain it, one person should be enough.

What was our ROI?

Nothing happens quickly. It requires six to eight months, minimum, to show a return on investment. You are going to invest in the tool, then you are going to do training, then you are going to roll it out. And organizations have different project teams; they have to change their mindset. That process takes time. It's good when it happens. Once you have the system in place, after something like a year-and-a-half you'll see a good enough return on investment. That's the strategy we have. But we have to convince the client so that they understand this approach.

What's my experience with pricing, setup cost, and licensing?

The problem is that the cost of this tool is pretty high. Even if an organization likes the tool, at times it becomes difficult for us to sell the license. CA provides licenses for different utilities like masking but even if you break it up, the pricing is still high.

Which other solutions did I evaluate?

IBM Optim is one competitor, as is Informatica. IBM has come up with the synthetic data feature in the last few years, although I don't recall the name of the tool they acquired. Informatica does not provide synthetic data yet.

Normal TDM features, like masking, are provided by both IBM and Informatica. People usually go for Informatica because it is easier for them to adopt the tool. Informatica is a very popular tool on the market for basic TDM-related activities and it's not as costly as CA TDM.

What other advice do I have?

I have been acquainted with this tool for three-and-a-half years and, since it was acquired by CA, we have worked pretty closely with CA to give feedback on what is expected out of the tool. We have worked very closely with the developers, as well, to enhance the tool.

We have two or three clients using it.

Disclosure: My company has a business relationship with this vendor other than being a customer: Reseller.
PeerSpot user
it_user796329 - PeerSpot reviewer
IT Manager at The Williams Companies, Inc.
Video Review
Real User
Allows us to find the right test data and to get required inputs into our API test
Pros and Cons
  • "TDM allows us to find the right test data for the test that we need, and then it also allows us to get the required data inputs into our API test, so that we can do a full test."
  • "​The scalability is outstanding. We're able to scale it to any size of data that we want. We can do small data sets, we can do large data sets."

    What is our primary use case?

    One thing that we're using Test Data Manager for is to build data marts so that we can test APIs of our application using users from every company within our application.
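As a toy sketch of that data-mart idea (the schema is invented; TDM builds and refreshes such marts for real), select one representative user per company into a small table the API tests can iterate over:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (user_id INTEGER, company TEXT, login TEXT);
    INSERT INTO users VALUES
        (1, 'acme',   'acme_admin'),
        (2, 'acme',   'acme_clerk'),
        (3, 'globex', 'globex_ops');
    -- One representative user per company, ready to feed the API tests.
    CREATE TABLE api_test_mart AS
        SELECT u.company, u.user_id, u.login
        FROM users u
        JOIN (SELECT company, MIN(user_id) AS user_id
              FROM users GROUP BY company) pick
          ON u.user_id = pick.user_id;
""")
for row in conn.execute("SELECT company, login FROM api_test_mart ORDER BY company"):
    print(row)
```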

    How has it helped my organization?

    The benefits are that TDM allows us to find the right test data for the test that we need, and then it also allows us to get the required data inputs into our API test, so that we can do a full test.

    What needs improvement?

    One of the features that I wanted, which I think is going to be released, is to be able to create virtualized data sets, or virtualized databases. That's a feature we're going to take advantage of. All of our developers will be able to have their own virtual copy of a golden copy of our database, and be able to do transactions against their virtual copy, and then restore back to a known good checkpoint.

    What do I think about the stability of the solution?

    This solution has been very stable for us. We've gone through multiple upgrades of versioning, and each one of them gets progressively better. 

    What do I think about the scalability of the solution?

    The scalability is outstanding. We're able to scale it to any size of data that we want. We can do small data sets, we can do large data sets.

    How are customer service and technical support?

    On many occasions, we have sought CA's technical team to help us solve problems, and they've always been very responsive. A good relationship.

    Which solution did I use previously and why did I switch?

    Our team made the decision that we were going to get into DevOps and do test automation. As a way of providing our API test adequate data, we knew we needed to have a better solution than manually collecting data from databases. So we brought in Test Data Manager to work in conjunction with our app test.

    What other advice do I have?

    If I were talking to my peer managers, I would recommend Test Data Manager - and I have, on multiple occasions - because it does allow the developer to have quick access to data that, normally, would take them hours or sometimes days to gather. 

    I would say TDM, on a scale of one to 10, is probably in the eight category. It's a very solid solution. I think it can do more for us, and we're always trying to find new ways of using Test Data Manager.

    Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
    PeerSpot user
    IT Specialist at a financial services firm with 1,001-5,000 employees
    Real User
    Masks and generates data while obeying the relationships in our relational databases
    Pros and Cons
    • "The data generation is one of the most valuable features because we are able to write a lot of rules. We have some specific rules here in Turkey, for example, Turkish ID IBAN codes for banks."
    • "There are different modules for masking. There is a portal and there is a standalone application as well. The standalone application is more old-fashioned. When you write rules on this old-fashioned interface, because it has more complex functions available for use, you can't migrate them to the portal."

    What is our primary use case?

    We use it for data generation, for performance testing, and other test cases. We also use data masking and data profiling for functional testing. Data masking is one of the important aims in our procurement of this tool because we have some sensitive data in production. We have to mask it to use it in a testing environment. Our real concern is masking and we are learning about this subject.

    How has it helped my organization?

    CA TDM is valuable for us because we use relational databases where it's problematic to sustain the relationships, foreign keys, and indexes. TDM obeys all the relationships and does the masking and data generation according to those relationships. 
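A minimal sketch of relationship-aware masking, with invented tables: the parent key is masked through a consistent mapping, and every child row's foreign key is remapped through the same mapping, so the parent/child relationship survives.

```python
import itertools

_new_ids = itertools.count(900001)
_id_map = {}

def mask_id(real_id):
    """Always return the same masked value for the same real key."""
    if real_id not in _id_map:
        _id_map[real_id] = next(_new_ids)
    return _id_map[real_id]

customers = [{"customer_id": 17, "name": "Jane"}]
orders    = [{"order_id": 1, "customer_id": 17},
             {"order_id": 2, "customer_id": 17}]

masked_customers = [{**c, "customer_id": mask_id(c["customer_id"]), "name": "MASKED"}
                    for c in customers]
masked_orders    = [{**o, "customer_id": mask_id(o["customer_id"])} for o in orders]

# Both orders still point at the masked customer 900001.
print(masked_customers, masked_orders)
```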

    Also, the testing team is using TDM to write the rules. Using this tool, our knowledge of data discovery skills has increased. That is an advance for our company.

    In terms of performance testing, before TDM, preparing the data and generating 20,000 sets of data took a week. Now, with TDM, it takes just one day, which is great. We haven't had much experience with masking yet, as we are in the adaptation phase, but data generation has increased our performance by about 60 percent.

    What is most valuable?

    The tool has strong data generation functions. When we needed a special function that was not in the list, the support team generated it and added it via patches within a limited time frame.
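One example of the country-specific rules mentioned above is the Turkish national ID (TC Kimlik No). Here is a hedged sketch, based on the commonly published checksum algorithm, of generating IDs whose two check digits validate, so downstream validation accepts the synthetic values:

```python
import random

def turkish_id() -> str:
    # First digit must be non-zero; generate nine random digits.
    d = [random.randint(1, 9)] + [random.randint(0, 9) for _ in range(8)]
    odd, even = sum(d[0::2]), sum(d[1::2])
    d.append((odd * 7 - even) % 10)   # 10th digit, per the published rule
    d.append(sum(d) % 10)             # 11th digit: sum of first ten, mod 10
    return "".join(map(str, d))

print(turkish_id())
```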

    For performance testing, we needed large amounts of data. The effort for data generation for this purpose has also decreased significantly.

    Due to the security policies and regulations we have to obey, we needed masked production data for testing. With the help of this tool, we can mask the data in a variety of ways (shuffling, using a seed list, using functions, etc.) while preserving data integrity.
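Here is a quick sketch of two of those masking styles, shuffling a column within itself and substituting from a seed list, using invented data:

```python
import random

rng = random.Random(42)   # fixed seed => repeatable masking runs

names    = ["Ayşe", "Mehmet", "Fatma", "Ali"]
salaries = [4200, 5100, 3900, 6100]

# Shuffle: keep the real value distribution but break the row linkage.
shuffled = salaries[:]
rng.shuffle(shuffled)

# Seed list: replace real names with values drawn from an approved list.
SEED_NAMES = ["Test-A", "Test-B", "Test-C", "Test-D"]
masked_names = [rng.choice(SEED_NAMES) for _ in names]

print(list(zip(masked_names, shuffled)))
```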

    What needs improvement?

    There are different modules for masking. There is a portal, and there is a standalone application as well. The standalone application is more old-fashioned, but it has more complex functions available for use. When you write rules on that old-fashioned interface, you can't migrate them to the portal. 

    We also have some security policies in our company that needed adaptation. For example, the people writing the rules would see all the production data, which is a large problem for us. It would be helpful if there was an increase in the ability to apply security policies.

    For how long have I used the solution?

    One to three years.

    What do I think about the stability of the solution?

    The tool is stable. This was one of the reasons that we chose it. We haven't had an issue with any unknown problems or issues, so it has paid off.

    What do I think about the scalability of the solution?

    Scalability is a matter of how you use your systems. Our requirements meant using it for MS SQL Server, Db2, and Db2 LUW. We scaled the tool across all the databases we have, so it's scalable.

    How are customer service and technical support?

    Technical support is okay. We haven't had many issues lately, but we had a bug at the proof of concept stage and they solved it.

    Which solution did I use previously and why did I switch?

    We did not have a previous solution.

    How was the initial setup?

    The initial setup was straightforward. One of CA's consultants came to our company and did the installation in about two days. We use mainframes here, and mainframes are very complex. Still, the consultant did it in two days.

    What about the implementation team?

    We worked with a CA consultant to do all the adaptation over the course of about two months. We were happy with him.

    What's my experience with pricing, setup cost, and licensing?

    Part of the licensing is dependent on whether you want to use the portal. It's based on floating users. The other part is dependent on what type of system you are using. We are using mainframe, so we paid good money for a mainframe license. It's okay because, for us, the main work of this tool is on those systems. The mainframe is a critical system, so the cost is okay.

    Which other solutions did I evaluate?

    We looked at IBM Optim and Informatica TDM.

    What other advice do I have?

    It's important to know the requirements of your system, for example, the security policies you have to observe. The requirements may include a concern about relational or other database systems. You have to know your systems. Depending on your system, consider using one or more consultants, because we had a problem just using one. Also, compare all the tools by doing proofs of concept. That's important.

    We have been using it for three months, but before that we also did a proof of concept in stages for about a year.

    Regarding future use, we plan to use it in automation testing with continuous integration tools. Before running the automated tests, we will prepare our generated data with TDM. We also have a future plan for storage virtualization and use of Docker applications. It is possible that for Docker we would also use the TDM rule set. I want to believe it's scalable.

    We have five testers using it to write rules. We also have 20 business analysts using and running these rules. In terms of maintenance, two developers would be enough. Our consultant coached our developers regarding our requirements. A testing engineer would also be okay for maintenance.

    Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
    PeerSpot user
    Senior Project Manager /Senior Solution Architect at Cognizant
    Real User
    Data privatization, provisioning, and generation for DevOps and CI/CD pipeline
    Pros and Cons
    • "Data privatization (GDPR enable), synthetic test data generation, and test data provisioning are its main interesting features."
    • "Needs improvement on SAP test data generation for SAP testing."

    What is our primary use case?

    It is the Test Data Management solution for our DevOps model, which is very useful. Data privatization (GDPR-enabled), synthetic test data generation, and test data provisioning are its main interesting features.

    How has it helped my organization?

    On-time production and real-time data for the DevOps testing environment and CI/CD pipeline.

    What is most valuable?

    Data privatization, provisioning, and generation for DevOps and CI/CD pipeline. 

    What needs improvement?

    • More features for Big Data environment data privatization. 
    • Domain-specific synthetic data generation. 
    • SAP test data generation for SAP testing.

    For how long have I used the solution?

    Still implementing.
    Disclosure: I am a real user, and this review is based on my own experience and opinions.
    PeerSpot user
    Network Engineer at a financial services firm with 1,001-5,000 employees
    Real User
    It scales very well to our network and we have a very large network

    What is our primary use case?

    Monitoring network devices using SNMP. It works very well. 
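For a sense of what such a poll looks like under the hood, here is a generic sketch, not the product's implementation, assuming the third-party pysnmp library; the host and community string are placeholders:

```python
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

# Read sysDescr from a device over SNMP v2c.
error_indication, error_status, error_index, var_binds = next(getCmd(
    SnmpEngine(),
    CommunityData("public", mpModel=1),          # v2c; placeholder community
    UdpTransportTarget(("192.0.2.10", 161)),     # placeholder device address
    ContextData(),
    ObjectType(ObjectIdentity("SNMPv2-MIB", "sysDescr", 0)),
))

if error_indication:
    print(error_indication)
else:
    for name, value in var_binds:
        print(f"{name} = {value}")
```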

    How has it helped my organization?

    • Scalability
    • The ability to have multiple pieces of information on the same screen. 

    What is most valuable?

    • The flexibility
    • The ability to view the data the way we want it. 

    What needs improvement?

    More data visualization. We want to be able to look at the data in different ways, so we are looking to expand the visualization of that data.

    What do I think about the stability of the solution?

    It is very stable. We have had issues, but we have worked through those issues with CA, and they have been successfully resolved. 

    What do I think about the scalability of the solution?

    It scales very well to our network. We have a very large network, and finding a solution that can actually monitor all the devices and interfaces is hard; this product has been able to do that.

    How are customer service and technical support?

    Technical support is very good. They have performed to our expectations.

    Which solution did I use previously and why did I switch?

    We were previously using a different solution; however, CA purchased that solution.

    How was the initial setup?

    Due to our environment, it was complex. The product itself is simple. 

    Which other solutions did I evaluate?

    SevOne.

    What other advice do I have?

    I would recommend this solution.

    Most important criteria when selecting a vendor: 

    • Stability
    • The size of the company
    • The ability to respond to and meet our needs. 
    • The breadth of software that they have available for what we are looking to do.
    Disclosure: I am a real user, and this review is based on my own experience and opinions.
    PeerSpot user
    it_user466854 - PeerSpot reviewer
    Practice Leader - DevOps at CIBER
    Consultant
    We use it to assist our clients with data privacy and the regulatory recommendations.

    What is most valuable?

    The most valuable features for us are masking, data profiling, and creating data subsets. More specifically, we are able to assist our clients with data privacy and the regulatory recommendations that come from the government. We help them to comply with PII, IP, PHI, and PCI regulations.
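As a simplified sketch of the data-profiling step, not a compliance-grade rule set: scan sample column values against regexes for common PII shapes and flag candidate columns for masking. Patterns and data are illustrative.

```python
import re

PII_PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "ssn":   re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "card":  re.compile(r"^\d{4}([ -]?\d{4}){3}$"),
}

def profile_column(values):
    """Return the fraction of sampled values matching each PII pattern."""
    n = len(values)
    hits = {label: sum(bool(p.match(v)) for v in values)
            for label, p in PII_PATTERNS.items()}
    return {label: count / n for label, count in hits.items() if count}

print(profile_column(["jane@example.com", "bob@example.org", "n/a"]))
# {'email': 0.666...}  -> flag this column for masking
```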

    How has it helped my organization?

    CA Test Data Manager is enormously helpful to us. We assist our customers by speeding up the application development process using real-time test data and synthetic test data, which mimics the real test data.

    What needs improvement?

    Integration

    What do I think about the stability of the solution?

    CA Test Data Manager is pretty stable, but integration is where we are looking for some improvements.

    What do I think about the scalability of the solution?

    It is fairly scalable for the implementations I've participated in. We haven't yet utilized the current available capacity.

    How are customer service and technical support?

    I would give technical support 8/10. Generally, we get a solution to an issue, but we have to go through multiple iterations before we get a complete resolution.

    Which solution did I use previously and why did I switch?

    Prior to implementing Test Data Manager, all our work was done manually. We used custom SQL scripts but, because of ICD regulatory recommendations, we switched to Test Data Manager.

    How was the initial setup?

    Initial setup was complex in comparison to other solutions for which we did proof-of-concept. There are a lot of contact points with the TDM suite, which I personally felt increased the complexity.

    Which other solutions did I evaluate?

    We evaluated Delphix and IBM, as well as CA Test Data Manager. One of the reasons we chose CA, aside from the fact that we are CA partners, is due to support for PCI and PHI in terms of faster test data generation. The biggest differentiation was in generating test cases from the data. CA implemented this for test matching and then integrated it with Agile Requirements Designer. That tipped the scales in favor of CA TDM.

    When choosing a vendor, we look for continuous innovation and continued support. Continuous innovation can release features into the market ahead of other vendors. So that's something we always look for.

    What other advice do I have?

    My recommendation is to perform a detailed evaluation. If only simple, straightforward, and small-scale test data management is needed, I don’t think a large solution such as CA TDM is necessary. To justify the cost of CA TDM, you need to have need for large-scale test data management.

    Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
    PeerSpot user
    it_user542772 - PeerSpot reviewer
    COE Consultant Test at a financial services firm with 10,001+ employees
    Consultant
    It has removed the dependency of making production data available for development and testing activities.

    What is most valuable?

    Data masking and synthetic data generation.

    How has it helped my organization?

    By using this product we are able to provide test data for the development and testing teams. It has removed the dependency of making production data available for development and testing activities. Using data masking techniques, we can comply with the rule of non-disclosure of personally identifiable information.

    What needs improvement?

    Automating repetitive tasks.

    For how long have I used the solution?

    More than one year.

    What was my experience with deployment of the solution?

    By using different functionalities of CA Test Data Manager, we were able to mask and deploy the data very easily to various environments.

    What do I think about the stability of the solution?

    The product is quite stable, but it has some functional bugs, which are fixed as soon as they are reported to the support team.

    What do I think about the scalability of the solution?

    Yes, the product scales well enough to satisfy our data masking and synthetic data generation requirements.

    How are customer service and technical support?

    Customer Service:

    8 out of 10

    Technical Support:

    9 out of 10. All our queries and functional defects were resolved in very little time. CA technical support people are proactive and we get fixes very quickly.

    Which solution did I use previously and why did I switch?

    No, we didn’t use any other tool beforehand.

    How was the initial setup?

    Initial setup was pretty straightforward. We just require a license and a native server; after installation, the product is available to all users under that server.

    What about the implementation team?

    We implemented this in-house.

    Which other solutions did I evaluate?

    No, we didn’t evaluate other options. We had researched this tool and then chose it for our requirements.

    Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
    PeerSpot user