How to Evaluate an Embedded Analytics Solution

Izenda’s software product team developed an evaluation process using the knowledge gained from more than 2,000 unique application integrations. With this guide, we want to give your organization the information needed to make the right choice for embedding analytics.

Software companies can’t complete the evaluation alone. Their teams need information and help from the vendors they evaluate, and Izenda likewise needs information from your organization to deliver a top-notch level of service during the evaluation and through to a successful deployment.

Evaluation Process Overview

The evaluation process includes three phases: Scope, Prepare and Evaluate. As in any software project, the software company and the analytics vendor must first determine the project’s scope. Both firms then prepare for the evaluation, for example by providing documentation and code samples. Finally, the evaluation proper begins: the software company validates the functionality of the embedded analytics solution, gains an understanding of the integration process and reviews the OEM agreement before a purchase can be made.

How Long Will it Take?

The evaluation process will require the combined efforts of the project sponsor, business analyst and lead developer. Expect to spend 20 to 60 hours over a 2 to 4 week period to complete your evaluation, with your analytics vendor involved at various stages. Exactly how long depends on the status of your own application project and its requirements, plus how many features of Izenda’s embedded analytics solution will be offered in your application upon its initial release. The project sponsor and developer can expect to spend at least eight hours on the evaluation. The business analyst’s time depends on how deeply your software company needs to go into report and dashboard design.

Taking a Closer Look at the Project’s Scope

Before anything else comes a demonstration of Izenda’s embedded analytics solution. Software companies need to know what the analytics solution can do, how it works and its basic architecture. Once an Izenda solutions architect leads them through the demonstration, the software organization will want to learn more in a technical deep dive. The demo and technical examination prepare the software product team and Izenda for evaluation scoping.

Izenda’s integration engineers need to know how your software application works as well. After a demonstration of the application, product teams from both companies will discuss its architecture. This is the time to set up detailed functional and technical requirements, with Izenda’s team sharing what’s needed to meet those requirements.

With these succinct requirements in hand, the two organizations can set timelines and deliverables in the evaluation and integration process. How can they make this happen? Well, your company needs to set a clear decision-making process that’s understood by all parties. This can only happen if you designate who will be involved in the evaluation process.

Many organizations slip up at this point by forgetting to include key personnel on the evaluation team. We’re all dealing with data, so it seems obvious that someone knowledgeable about your databases needs to be part of the evaluation team. That might be your DBA or a business analyst familiar with the data.

You’ll also need to include a key stakeholder in the evaluation process, ideally the chief supporter of the embedded analytics project, so the effort doesn’t get bogged down.

Before moving on, now that we’ve collectively identified your evaluation team and their Izenda counterparts, we’ll need to agree on the project’s success criteria. This develops from your original goals for embedding analytics into your application. What are you trying to accomplish? Did you just want to add ad hoc reporting? Is report and dashboard customization important to your customers? Will end users need access to multiple databases? By understanding these goals, we can set the proper success criteria.

Preparing to Evaluate the Embedded Analytics Solution

Your evaluation team needs more than memories of the Izenda demo and your notes from the technical deep dive to do a proper evaluation. To that end, Izenda provides documentation, tools and guides to facilitate the process.

Explore the Documentation – Izenda will provide API documentation, code samples, access to product videos and end-user guides for your examination.

To complete a functional evaluation, Izenda provides a standalone kit. (This is a good point to remind you of our Self-Service Analytics Portal, which allows you to get your application to market quickly with our white-labeled analytics solution. Your software product team then gets the time they need to phase in a fully embedded Izenda analytics solution.)

In addition to the functional evaluation, your team needs to complete an embedded evaluation, and we’ll supply an HTML kit for that purpose. Continuing further requires an integration evaluation, for which we provide sample integration kits for frameworks such as MVC and Angular.

Evaluation of Embedded Analytics and its Functionality

Evaluation teams get hands-on to learn whether a software product meets their requirements, and Izenda embedded analytics is no different. To validate Izenda’s functionality, they’ll need to create 3 to 5 of the core reports your customers need. Dashboard design can then proceed using these reports. After assessing the analytics solution’s performance, it’s time to move on to understanding how it fits with your application.

Five key topics come under consideration for the evaluation team to understand integration, from front-end embedding to security integration:

Front-End Embedding – Izenda’s modern, 3-tier embedded architecture makes it possible for almost any application written using any programming language or framework to embed analytics.

Branding/Styling – Learn how Izenda enables white-label branding, giving the analytics the look and feel of your application so your customers explore data in the context of your business application, at the point of decision.

Data Modeling – Does it provide an easy way to alias database fields with business-relevant terminology? Is the administrative UI user-friendly enough for non-technical staff to do this? Izenda’s administrative interface lets business analysts and other non-technical users connect to data sources for reporting. Using the UI enables them to:

• provision, join and secure data sources
• alias data source and field names
• manage tables, views and stored procedures
• create calculated fields
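To make field aliasing concrete, here is a minimal sketch of what a data-modeling layer does behind the scenes. The column names and aliases below are invented for illustration and are not Izenda’s actual schema or API; in practice this mapping would be maintained through the administrative UI rather than in code.

```python
# Hypothetical sketch: mapping raw database field names to
# business-relevant aliases, as a data-modeling layer might do.
# Field names and aliases here are invented examples.

FIELD_ALIASES = {
    "cust_nm": "Customer Name",
    "ord_dt": "Order Date",
    "ord_total_amt": "Order Total",
}

def alias_row(row: dict) -> dict:
    """Return a copy of a result row with raw column names replaced
    by their business-friendly aliases (unknown columns pass through)."""
    return {FIELD_ALIASES.get(col, col): val for col, val in row.items()}

raw = {"cust_nm": "Acme Corp", "ord_total_amt": 1250.00, "region": "East"}
print(alias_row(raw))
# {'Customer Name': 'Acme Corp', 'Order Total': 1250.0, 'region': 'East'}
```

The point of the alias layer is that end users building reports see “Customer Name”, never `cust_nm`, without any change to the underlying database.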

API Interactions – An exposed REST API makes automation and extension straightforward, giving developers who prefer to work in code, or who want to customize further, an open and well-documented interface.
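As a sketch of what working against such a REST API looks like, the snippet below builds (but does not send) an authenticated request to a hypothetical report-export endpoint. The endpoint path, payload fields and bearer-token scheme are assumptions for illustration, not Izenda’s actual API; consult the vendor’s API documentation for the real routes.

```python
# Illustrative only: constructing an authenticated call to a
# hypothetical reporting REST API. The /api/reports path, payload
# and token scheme are assumptions, not a documented vendor API.
import json
import urllib.request

def build_report_request(base_url: str, report_id: str, token: str) -> urllib.request.Request:
    """Build (but do not send) a POST request that would export a
    report as JSON from an assumed /api/reports/<id>/export route."""
    url = f"{base_url}/api/reports/{report_id}/export"
    body = json.dumps({"format": "json"}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",  # token issued by the host app
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_report_request("https://analytics.example.com", "sales-summary", "TOKEN123")
print(req.full_url)  # https://analytics.example.com/api/reports/sales-summary/export
```

The same pattern extends to scripting report creation, scheduling or user provisioning, which is what makes a well-documented REST surface valuable for automation.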

Security Integration – Unlike many vendors that require a separate security structure for the application (or applications) that make up their analytics solution, Izenda inherits your application’s security model to control access to data and to set user roles and permissions.
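A minimal sketch of what “inheriting the application’s security model” can mean in practice: the analytics layer derives its permissions from roles the host application already assigns, rather than maintaining its own user store. The role names and permission sets below are hypothetical examples, not Izenda’s actual role scheme.

```python
# Hypothetical sketch: deriving analytics permissions from the host
# application's existing roles. Role and permission names are
# invented examples, not an actual vendor role scheme.

ROLE_PERMISSIONS = {
    "admin":   {"view_reports", "create_reports", "manage_data_sources"},
    "analyst": {"view_reports", "create_reports"},
    "viewer":  {"view_reports"},
}

def analytics_permissions(app_roles: list[str]) -> set[str]:
    """Union of analytics permissions across the roles the host
    application already assigns to a user; unknown roles grant nothing."""
    perms: set[str] = set()
    for role in app_roles:
        perms |= ROLE_PERMISSIONS.get(role, set())
    return perms

print(sorted(analytics_permissions(["viewer", "analyst"])))
# ['create_reports', 'view_reports']
```

Because the mapping reads from roles the application already manages, there is no second user directory to keep in sync.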

Sharing Evaluation with Stakeholders

Once your team completes their evaluation, it’s time to demonstrate to key stakeholders how embedded analytics work with your application. Your team will share what they’ve learned, what they like, what steps they’d like Izenda to take and what timeline they see as possible for analytics integration. Izenda can answer questions for these stakeholders as well in support of the evaluation process.

For the next step, your team will review the OEM agreement and Izenda’s licensing. Once you have the approval from key stakeholders and the appropriate decision maker, purchase of Izenda embedded analytics proceeds. All that’s left is integration of embedded analytics with your application.

