Review of Major Challenges in Data Analysis of Big Data
Journal Title: International Journal of Engineering Sciences & Research Technology - Year 2014, Vol 3, Issue 12
Abstract
“Big Data” is a term encompassing the use of techniques to capture, process, analyze and visualize potentially large datasets within a timeframe not accessible to standard IT technologies. By extension, the platforms, tools and software used for this purpose are collectively called “Big Data technologies”. Heterogeneity, scale, timeliness, complexity, and privacy problems with Big Data impede progress at all phases of the pipeline that can create value from data. The problems start right away during data acquisition, when the data tsunami requires us to make decisions, currently in an ad hoc manner, about what data to keep and what to discard, and how to store what we keep reliably with the right metadata. Much of today's data is not natively in structured format; for example, tweets and blogs are weakly structured pieces of text, while images and video are structured for storage and display, but not for semantic content and search: transforming such content into a structured format for later analysis is a major challenge. The value of data explodes when it can be linked with other data; thus data integration is a major creator of value. Since most data is directly generated in digital format today, we have both the opportunity and the challenge to influence data creation so as to facilitate later linkage, and to automatically link previously created data. Data analysis, organization, retrieval, and modeling are further foundational challenges.
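To make the "weakly structured text" challenge concrete, the following is a minimal sketch of turning a tweet into a structured record suitable for later analysis and linkage. The field names and the `structure_tweet` function are illustrative choices, not part of the reviewed work; a production pipeline would use a dedicated tokenizer and entity extractor.

```python
import re

def structure_tweet(text):
    """Extract structured fields (hashtags, mentions, URLs) from a
    weakly structured tweet; the leftover plain text becomes the body.
    Illustrative only -- field names are our own convention."""
    hashtags = re.findall(r"#(\w+)", text)
    mentions = re.findall(r"@(\w+)", text)
    urls = re.findall(r"https?://\S+", text)
    # Strip the extracted tokens and collapse leftover whitespace.
    body = re.sub(r"#\w+|@\w+|https?://\S+", "", text)
    body = re.sub(r"\s{2,}", " ", body).strip()
    return {"hashtags": hashtags, "mentions": mentions,
            "urls": urls, "body": body}

record = structure_tweet(
    "Big data keynote by @jdoe at #Strata2014 http://example.com/slides was great!"
)
print(record["hashtags"])  # -> ['Strata2014']
print(record["mentions"])  # -> ['jdoe']
```

Once tweets are in this record form, the hashtags or URLs can serve as join keys for linking against other datasets, which is exactly where, as the abstract notes, the value of the data "explodes".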