
Literature Review - In-Memory Technology

...

The key difference between the traditional hard-disk-based query approach and the in-memory query approach is illustrated in the diagram below. (Read, 2013)

[Figure: comparison of the traditional disk-based and the in-memory query approaches]
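
To make the contrast concrete, the sketch below runs the same aggregate query against an on-disk SQLite file and against an in-memory SQLite database. SQLite and the table and column names are chosen purely for illustration and are not products or schemas discussed in the reviewed papers; the point is only that the in-memory path avoids disk I/O on the query path.

    import sqlite3
    import time

    def timed_group_by(conn):
        # Build a small fact table and time a simple aggregation over it.
        conn.execute("DROP TABLE IF EXISTS sales")
        conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
        conn.executemany("INSERT INTO sales VALUES (?, ?)",
                         [("EMEA" if i % 2 else "APAC", float(i)) for i in range(100000)])
        conn.commit()
        start = time.perf_counter()
        conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall()
        return time.perf_counter() - start

    disk_conn = sqlite3.connect("sales.db")      # data pages live on disk
    ram_conn = sqlite3.connect(":memory:")       # entire database held in main memory

    print("disk-based query time:", timed_group_by(disk_conn))
    print("in-memory query time:", timed_group_by(ram_conn))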

A different approach to using in-memory technology is to store information multi-dimensionally rather than loading existing structures into memory. This method provides strong analytical capability with better performance and, when coupled with write-back mechanisms, opens up the potential for “what-if” analysis on the data, enabling business forecasting. (Read, 2013)
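
A minimal sketch of that idea, assuming a small invented revenue cube indexed by region, product and month: totals are computed from the in-memory cube, a write-back adjustment is applied to one slice, and the "what-if" totals are recomputed.

    import numpy as np

    # Invented cube: revenue by (region, product, month), held entirely in memory.
    rng = np.random.default_rng(0)
    cube = rng.uniform(100, 500, size=(3, 4, 12))

    baseline = cube.sum(axis=(1, 2))       # total revenue per region

    # Write-back: assume product 2 in region 0 grows 10% in the second half-year.
    cube[0, 2, 6:] *= 1.10
    scenario = cube.sum(axis=(1, 2))       # recomputed "what-if" totals

    print("baseline:", baseline.round(1))
    print("what-if :", scenario.round(1))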

In-memory technology can be deployed at a platform level or at an end-user desktop level. A desktop-level implementation is limited by the main-memory resources available on the user's desktop and caters for simple analysis and less complex ad-hoc reporting. A platform-level or server-level implementation provides the server resources to handle larger data sets for a wide range of users, but is limited by network latency and by server resources when servicing data requests. (Read, 2013)

With better performance and reduced maintenance effort to showcase, it is not difficult to sell in-memory technology to a customer, but it is still necessary to analyze whether in-memory technology is actually the right solution. The factors to consider are listed below:

- Companies with smaller data volumes and minimal reporting requirements do not need to employ in-memory technology.

- In-memory technology can address more than 4 GB of memory only on 64-bit IT infrastructure. If the organization has not yet moved to a 64-bit architecture, adopting this technology is not advisable; a short calculation after this list shows where the 4 GB limit comes from. (Read, 2013)

- High availability, disaster recovery, and data backup and recovery need to be implemented for the new infrastructure; these may be compliance requirements for some companies and hence become deciding factors. (Read, 2013)

- The in-memory software solution may cost up to 10 times more than the IT infrastructure (DRAM, main memory) needed to run it, which also weighs on the decision to adopt this technology. (Read, 2013)

- For example, scheduled reports sent out to business users do not need the fast response times provided by in-memory technology; the same applies to annual or quarterly reports, which are prepared well in advance and are not part of an ad-hoc reporting process. (Read, 2013)
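
The 4 GB figure mentioned above follows from the size of a 32-bit address space; the quick calculation below makes this explicit (actual per-process limits on 32-bit systems are typically even lower).

    # Addressable memory with 32-bit vs 64-bit pointers.
    addressable_32bit = 2 ** 32      # bytes
    addressable_64bit = 2 ** 64      # bytes

    print(f"32-bit address space: {addressable_32bit / 2**30:.0f} GiB")    # 4 GiB
    print(f"64-bit address space: {addressable_64bit / 2**30:.3e} GiB")    # ~1.7e10 GiB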

The need for in-memory technology can be assessed, from an IT perspective, using the following factors (RFM); a simple scoring sketch follows this list:

- Recency - How recently was the business information data source updated?

If low recency is acceptable, in-memory technology will be a suitable solution. (Read, 2013)

- Frequency - How frequently is the data needed to make business decisions?

If high frequency is required, in-memory technology will be a suitable solution. (Read, 2013)

- Monetary - What is the financial risk associated with the data not being available on time?

If the level of risk is acceptable, in-memory technology will be a suitable solution. (Read, 2013)
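
Read (2013) presents these factors qualitatively; the hypothetical helper below merely shows how they could be combined into a yes/no recommendation. The function name, thresholds and parameters are all invented for illustration.

    # Hypothetical RFM-style decision helper (thresholds are invented).
    def in_memory_recommended(low_recency_ok: bool,
                              queries_per_day: int,
                              delay_risk_acceptable: bool) -> bool:
        recency_fit = low_recency_ok              # a slightly stale in-memory snapshot is tolerable
        frequency_fit = queries_per_day > 100     # data is queried often enough to justify holding it in RAM
        monetary_fit = delay_risk_acceptable      # the financial risk profile tolerates this architecture
        return recency_fit and frequency_fit and monetary_fit

    print(in_memory_recommended(True, 500, True))    # True  -> in-memory is a reasonable fit
    print(in_memory_recommended(False, 20, True))    # False -> conventional disk-based reporting suffices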

Similarly, the need for in-memory technology can be based on the following factors from a business perspective:

- Customer Insight

- Performance Scorecard

- Planning and Forecasting Activities

However, future analytical solutions should be able to leverage the full potential of both disk-based and memory-based technologies to support business decision-making. (Read, 2013)

Paper – 3 – Shifting the BI Paradigm with In-Memory Database Technologies

Many vendors have adopted in-memory technology to build better-performing BI tools with attractive analytical capabilities. The advent of 64-bit technology and the affordability of RAM have paved the way for better implementations of in-memory technology, enhancing the insight that can be drawn from business data and ultimately creating a shift in BI capability. (Gill, 2007)

Database technologies started with databases residing on files and file systems. Data was stored sequentially and also accessed or retrieved sequentially, so the ability to run random queries was limited. (Gill, 2007)

Later, in the 1980s, the RDBMS was introduced, allowing data to be stored in tables and linked across tables using a unique key or reference key. The RDBMS was a transactional database (OLTP – Online Transaction Processing) and provided limited analytical capability for queries on the underlying data. (Gill, 2007)
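
The key-based linking described above is the core relational idea; a minimal sketch using SQLite (chosen only for illustration, with invented table names) follows.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE orders (id INTEGER PRIMARY KEY,
                             customer_id INTEGER REFERENCES customers(id),
                             amount REAL);
        INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
        INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 99.0), (12, 2, 500.0);
    """)
    rows = conn.execute("""
        SELECT c.name, SUM(o.amount)
        FROM customers c JOIN orders o ON o.customer_id = c.id
        GROUP BY c.name
        ORDER BY c.name
    """).fetchall()
    print(rows)   # [('Acme', 349.0), ('Globex', 500.0)]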

This led to the introduction of cubes and data warehousing with query-optimization techniques, termed OLAP (Online Analytical Processing), which ran on a separate database instance to avoid transaction-level database locks caused by complex queries. This became the basis for BI tools such as Oracle Hyperion, IBM Cognos and SAS (Statistical Analysis System). (Gill, 2007)

Modern technology brings in-memory databases that reside in main memory, offering improved performance and better analytical capabilities. In-memory databases can handle data in the range of 200 GB to 600 GB by applying compression techniques with factors of up to 12x, allowing the data to reside in a comparatively small amount of RAM. In-memory applications are also equipped with features to pull data from various traditional databases. (Gill, 2007)
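
As a back-of-envelope check on those figures, a 600 GB data set compressed at 12x would need roughly 50 GB of RAM to be held entirely in memory:

    dataset_gb = 600          # raw data volume cited above
    compression_factor = 12   # upper end of the compression range cited above
    ram_needed_gb = dataset_gb / compression_factor
    print(f"approximate RAM footprint: {ram_needed_gb:.0f} GB")   # 50 GB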

The key players in this space are:

- Oracle – Product: TimesTen

- SAP – Product: BI Accelerator

- QlikTech – Product: QlikView

In-memory technology has been influenced by the following factors:

- Affordable and better-performing hardware

- Introduction of 64-bit processors and operating systems

- Need for more information at a single time

- Need for a cheaper, faster and better BI and analytical capability

(Gill, 2007)

Research Gaps Identified

...
