Data Governance: Understanding the Value of Data

By | Big Data, Business

"Garbage in, garbage out" has long been the cry of the programmer facing an unintelligible report or other output. In the past, users relied on IT to make sure the information behind their decisions was the right data; IT was responsible for the storage, security and accessibility of that data. Times, they are a-changing.

More Data Means Less IT Involvement

In the past, the volume of data was manageable by IT departments. They had the time and the resources to handle data governance for the business intelligence groups. But as "big data" becomes more and more a part of the business decision-making process, it is quickly becoming too much for IT departments alone to handle. This leaves business users with more responsibility for their data, and that is where data governance processes become essential to the enterprise.

Read More

A Gimlet-Eyed View of Software as a Service

By | BI Innovation, Big Data, Technology

In computing days of yore, a company would buy a license for software and install it in their computer center, using vendor support and a team of systems programmers to get it up and running. In todayโ€™s Software as a Service paradigm, things work a little differently. There are clear benefits to SaaS, but also some important disadvantages to consider before moving to a SaaS application.

SaaS Benefits

The most obvious benefit is the IT cost savings realized when there is no longer a need to invest in data center and hardware infrastructure. A recent study by Computer Economics of seven businesses that employ cloud services found an average savings of 15%. Lower upfront costs make SaaS especially attractive for new businesses, which can start up and scale up faster and more cheaply. Speed is another crucial benefit: maintaining a service contract with a cloud provider means there is no time required for an infrastructure implementation and no need to hire and pay an IT staff to support it.

Read More

Microsoft Debuts New NoSQL Features

By | Big Data, Microsoft

When one hears the words "Microsoft database," SQL Server usually comes to mind. But the company recently announced two new features that expand support for NoSQL on its Azure cloud platform. Azure DocumentDB is a NoSQL database service that promises to unite the processing and transactional capabilities of a relational database with the flexibility of a NoSQL database. DocumentDB is highly scalable, efficiently indexes heterogeneous documents without requiring a schema or secondary indices, and allows for rich query and transaction processing capabilities.
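
As a rough illustration of that query capability, here is the kind of SQL-like statement DocumentDB accepts against schema-free JSON documents; the collection alias and property names below are hypothetical, not taken from Microsoft's announcement.

```sql
-- Hypothetical query against a DocumentDB collection of JSON documents.
-- No schema or secondary index is declared up front; nested properties
-- are addressed directly off the collection alias "c".
SELECT c.deviceId,
       c.readings.temperature,
       c.readings.humidity
FROM   c
WHERE  c.deviceType = "thermostat"
  AND  c.readings.temperature > 80
```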

Read More

Embedded BI & Visual Analytics Provider Izenda Secures $3M

By | BI Innovation, Big Data, News & Events, Technology

Ethos Partners and Hawthorne Capital Raise Growth Equity Funding For Expansion

ATLANTA, GA (August 7, 2014) – Embedded BI, visual analytics and self-service reporting software provider Izenda (http://www.izenda.com) has raised $3 million in a growth equity investment round led by Ethos Capital Partners and Hawthorne Capital. Izenda provides business intelligence visualization software for data discovery via an HTML5 model versus a developer-oriented, desktop-based environment. Current customers include Oracle, Infor, CDC, Volvo, US Navy, and Thomson Reuters.

"We are delighted to be backing Izenda in its continual efforts to accelerate and improve the highly accepted suite of BI and reporting products," said Dr. Mark Bell, Partner at Ethos Capital. "Our customer interactions confirmed that Izenda provides an experience that is ahead of the trend of placing decision-making tools into the hands of decision makers."

Read More

The BI Implications of Highly Customizable Data

By | BI Innovation, Big Data, Tips

As seen on TDWI.org

Although storage prices are dropping, you must still consider how your customized data set will work for self-service or ad hoc reporting, especially in a real-time environment.

Modern DBMSs were designed in an era when someone was in charge of all data structures. A DBA, or sometimes a small committee of them, dictated what could be stored and how it should be structured.

At the time, data storage was unbelievably expensive by today’s standards. Imagine oil being $10,000 a barrel, and consider how carefully it might be controlled and utilized. In a world where billion-dollar companies could not even store as much data as you have on your cell phone, relational data structures represented efficient ways to store various types of data such as strings, dates and numbers in a structured way that enabled quick look-up through indexing.
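
To make that concrete, here is a minimal sketch of the kind of structure being described: typed columns keep storage compact, and an index provides the quick look-up. The table and column names are illustrative only.

```sql
-- Typed columns: every byte was accounted for when storage was scarce.
CREATE TABLE orders (
    order_id    INTEGER       NOT NULL PRIMARY KEY,
    customer_id INTEGER       NOT NULL,
    order_date  DATE          NOT NULL,
    ship_city   VARCHAR(40),
    total_amt   DECIMAL(10,2)
);

-- A secondary index enables fast look-up without scanning the whole table.
CREATE INDEX ix_orders_customer ON orders (customer_id, order_date);
```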

Read More

How Data Virtualization Delivers Access to Real-Time Data

By | BI Innovation, Big Data, Technology, The Cloud

As seen on TDWI.org

With so much data in so many places, how can you quickly connect to the sources you need? Data virtualization may be the answer.

In a world where yesterday’s data is like yesterday’s news and users are accustomed to finding out “what’s happening right now” on social media platforms such as Twitter, virtualization is quickly becoming the ideal data management strategy.

Traditional data integration requires building a data warehouse and moving enterprise data to it on a periodic basis. Today’s data virtualization capabilities connect directly to live databases and pull information “just in time.” This approach delivers real-time results without the costs and delays of a complex ETL (extract, transform, and load) project. Business users, now accustomed to instant real-time access to data in their consumer lives, expect the same experience from their enterprise systems. Generation Y employees who have accessed the Internet from an early age are often surprised that ETL technologies must move data to a data warehouse before it can be analyzed, meaning business reports deliver yesterday’s data.
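
As a sketch of the idea, a virtualized query can join two operational systems directly instead of waiting for an overnight ETL job. The source names below and the way they appear as schemas are assumptions; the mechanism for registering live sources varies by virtualization product.

```sql
-- "crm" and "erp" are virtual schemas the data virtualization layer maps
-- onto live source systems; no data is copied into a warehouse first.
SELECT c.customer_name,
       o.order_id,
       o.order_total,
       o.order_timestamp
FROM   crm.customers AS c
JOIN   erp.orders    AS o ON o.customer_id = c.customer_id
WHERE  o.order_timestamp >= CURRENT_TIMESTAMP - INTERVAL '1' HOUR;
```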

Read More

Is SQL the x86 of Business?

By | Big Data, Technology

The x86 instruction set is the lifeblood of modern IT. While it is easy to criticize for its lack of elegance and archaic nature, the very low end (mobile devices) and the very high end (supercomputers) seem to be the only places where x86 is not overwhelmingly dominant. Many other platforms, like SPARC, MIPS and even Intel’s own Itanium, are no longer viable options for the overwhelming majority of applications. The platform is very prominent in the very high end and is making inroads into the mobile market currently dominated by ARM. Apple, a longtime PowerPC company, recently shifted to x86 with enormous success.

Is x86 the best architecture? Nope. The fastest or the most power-efficient? Not even close. The great x86 advantage is that trillions of dollars have been invested in it in terms of software, hardware, expertise and intellectual capital. A platform shift today would require more investment than the government bailout of Wall Street.

A decade ago the industry did a lot of soul searching to figure out what’s important in the server world. Was it 64-bit? Low power? High IPC? Lots of threads? Bandwidth? Megahertz? Emulation? In that search, new architectures emerged, like Itanium and NetBurst (which was x86-based), that just didn’t make it. Companies like Intel, HP and Sun invested billions of dollars creating processor platforms that are not competitive with 64-bit multi-core x86.

Where SQL Came From

SQL was designed for computing needs that emerged more than a quarter century ago. The hard disk became the ideal way to store large amounts of data and has held that position to date. Putting data on spinning platters rather than in random access memory created many very complex but ultimately solvable problems. Financial and accounting applications dominated the computing landscape, and things like transactional integrity and ACID compliance were crucial. A young guy named Larry figured out how to monetize this and became one of the richest men in the world.
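
That transactional integrity boils down to statements like the following, where either both account rows change or neither does. The table and amounts are hypothetical.

```sql
-- A classic funds transfer: ACID guarantees the debit and credit
-- succeed or fail together, even if the server crashes midway.
BEGIN TRANSACTION;

UPDATE accounts SET balance = balance - 250.00 WHERE account_id = 1001;
UPDATE accounts SET balance = balance + 250.00 WHERE account_id = 2002;

COMMIT;
```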

Database Needs Today

The types of problems dominating database science today are very different. Applications like Google, Twitter and Facebook simply cannot function on the standard relational model on any single server that exists today. And while we still have spinning platters, a lot of data can now fit in memory. The number of users is now astronomical by 20th-century standards. While a large bank may have tens of millions of customers, each performs very few database operations in a day; most operations are automated, and a typical customer may only interact with the system a few times a month for very short periods of time.

Contrast this with an application like Facebook. There will soon be hundreds of millions of users, many of whom spend several hours a day on the site. What’s worse is that database operations can involve many parties. When you process a payment, it’s between your bank and one other party. When you post to Facebook, every one of your 10,000+ "friends" may need to be updated. This is profoundly beyond the scope and scale of the way SQL has traditionally operated.
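
To see why that fan-out hurts, compare the two workloads in relational terms. The tables below are a hypothetical sketch, not how Facebook actually stores posts.

```sql
-- A payment touches exactly two rows.
UPDATE accounts SET balance = balance - 20.00 WHERE account_id = 1001;
UPDATE accounts SET balance = balance + 20.00 WHERE account_id = 9002;

-- A status update, modeled naively, fans out to every friend's feed:
-- one post can become 10,000+ rows written.
INSERT INTO feeds (member_id, post_id)
SELECT f.friend_id, 42
FROM   friendships AS f
WHERE  f.member_id = 1001;
```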

New Ways of Thinking

While most of these new applications still have SQL components, the bulk of data exchange is done through new techniques that would be considered exotic and even blasphemous by SQL standards. Architectures like SimpleDB, BigTable and Astoria do many things in fundamentally different ways. By sending simple, stateless and sometimes structureless objects through the web as XML, these platforms solve the basic Create, Retrieve, Update and Delete (CRUD) needs of an application. While this works for day-to-day updates, a rich enterprise application has needs well beyond this.
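
In relational terms, the CRUD surface those services expose over the web corresponds roughly to the four statements below (the `items` table is invented for illustration); everything richer, such as joins, aggregates and multi-object transactions, is what lies beyond it.

```sql
-- The four CRUD operations, roughly the calls such services expose over HTTP.
INSERT INTO items (item_id, body) VALUES (123, '<item>...</item>');   -- Create
SELECT body FROM items WHERE item_id = 123;                           -- Retrieve
UPDATE items SET body = '<item>v2</item>' WHERE item_id = 123;        -- Update
DELETE FROM items WHERE item_id = 123;                                -- Delete
```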

The Demand for Strawberry Pop-Tarts

It turns out that demand for strawberry Pop-Tarts skyrockets right before hurricanes. Wal-Mart, North America’s largest employer and one of the few companies thriving in the current economy, not only knows this but aggressively does something about it. The real power behind SQL is not so much its ability to store structured data, but how it can transform business and provide insight into customer behavior over time and across many transactions. While you could say that such analysis is done with cube engines or BI tools, they are fundamentally based on relational databases.
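
That kind of insight comes from queries that cut across millions of transactions, something like the hypothetical aggregation below; the sales and storms tables are invented purely for illustration.

```sql
-- How do strawberry Pop-Tart sales in the three days before a hurricane's
-- landfall stack up, store by store? Table design is pure speculation.
SELECT s.store_id,
       SUM(s.quantity) AS units_sold
FROM   sales  AS s
JOIN   storms AS h
       ON  s.store_region = h.region
       AND s.sale_date BETWEEN h.landfall_date - INTERVAL '3' DAY
                           AND h.landfall_date
WHERE  s.product_name = 'Strawberry Pop-Tarts'
GROUP BY s.store_id
ORDER BY units_sold DESC;
```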

The Problem with Data Warehouses

The BI industry has long relied on data marts and data warehouses to speed up queries and avoid affecting transactional performance. A data warehouse uses an Extract, Transform and Load (ETL) process to effectively make a copy of production data. The data is often reduced, cleaned up and optimized, but essentially there is a copy process that usually happens overnight. Social, retail and news events move at a much faster pace. A retailer can’t wait until midnight to know what is selling on Black Friday at noon on the East Coast.
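
That overnight copy is essentially a batch job like the hypothetical one below, and the batch window is exactly the latency problem; the schema and column names are assumptions.

```sql
-- A nightly ETL step: copy (and lightly reshape) yesterday's production
-- rows into the warehouse. Anything sold after the cutoff waits a day.
INSERT INTO warehouse.fact_sales (sale_date, store_id, product_id, quantity, amount)
SELECT s.sale_date,
       s.store_id,
       s.product_id,
       s.quantity,
       s.quantity * s.unit_price
FROM   production.sales AS s
WHERE  s.sale_date = CURRENT_DATE - INTERVAL '1' DAY;
```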

The Solution Is SQL

While the SQL language does not do some things, like forecasting, it provides the foundation of what modern businesses need to operate in a competitive environment. This is especially true in the increasingly global and highly dynamic times we live in today. SQL has served us well; it needs to shed some remnants of an earlier era, but it lets us keep leveraging the trillions that have been poured into SQL-based technologies. Like x86, it has its rough edges but serves us well in so many situations that all of the advanced features of SQL will need to be supported in future cloud-based systems.