We use this solution for ETL, analytics, reporting, automation, integration between data sources (cloud, local, spreadsheets, etc.), data scrubbing for migration to new software, and automating the sending of files to banks instead of using EDI, Positive Pay, etc.
This solution has simplified getting to and understanding our data, no matter where it is housed.
The most valuable features are the ETL, automation, and ease of use by the user population.
This solution would be improved by a feature that let us build a common library of our own commands used in load scripts and expressions, so that typing a keyword would bring up a drop-down to select the command we are looking for, instead of the generic help. I keep them all in OneNote now, copy and paste them in, and then change the field names; this addition would really save time.
Examples:
Converting a date-time stamp to a plain date in the load script, so we can match to the company calendar for model year/month, fiscal year/month, and week of year:
date(floor(DTTMSTAMP),'MM/DD/YYYY') as TransactionDate,
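To illustrate the matching step, here is a rough sketch (not our production script; FiscalCalendar.qvd and the fiscal field names are placeholders) of joining a company calendar on the floored date:

Transactions:
LOAD
date(floor(DTTMSTAMP),'MM/DD/YYYY') as TransactionDate,
SalesUnits,
SalesDollars
FROM [..\SourceData\Transactions.qvd] (qvd);

// join the company calendar on the plain date so the fiscal fields land on each transaction row
Left Join (Transactions)
LOAD
date(CalendarDate,'MM/DD/YYYY') as TransactionDate,
FiscalYear,
FiscalMonth,
WeekOfYear
FROM [..\QVDData\FiscalCalendar.qvd] (qvd);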
Storing QVDs (this is the path needed from a Data Loader):
STORE SummarySalesAggregate INTO [..\QVDData\SummarySalesUnitsAndDollars.qvd];
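Reading the stored file back is just as simple; this is only an illustrative sketch, but any document can then do an optimized load from the QVD:

SummarySales:
LOAD *
FROM [..\QVDData\SummarySalesUnitsAndDollars.qvd] (qvd);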
Sorting and counting for creating arrays:
SortedAndCounted:
Left Join (SummarizeSales2)
LOAD
Option_Inv,
Line_No,
// restart the counter at 1 whenever Option_Inv changes, otherwise increment the previous row's counter
if(Peek('Option_Inv',-1) <> Option_Inv, 1, Peek('counter',-1) + 1) as counter
Resident SummarizeSales2
Order by Option_Inv asc, Line_No asc;
I have used this solution since May 2013.
I have never worked with Qlik Customer Service directly.
The setup wasn't bad. We did need some consulting advice on the server setup, but designing the organization and naming conventions should be up to you. We took the consultant's advice and ended up renaming and changing almost everything as far as naming conventions go.
We started with a consultant, but didn't use them for long.
Keep working with the company to get the pricing you need.
I just used the REST connector for an API in the load script; it worked excellently!
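For anyone curious, a REST pull in the load script looks roughly like this. It is only a sketch: it assumes a named REST data connection (Qlik Sense style), and 'REST_MyApi' plus the field names are placeholders, not our real API:

LIB CONNECT TO 'REST_MyApi';

// the connector exposes the JSON response as a table that the script reads with a SQL SELECT
ApiData:
SQL SELECT
"id",
"name",
"amount"
FROM JSON (wrap on) "root";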