Enabling Enterprise-Ready Analytics

The rapid growth of data that must be managed represents a daunting challenge, one that is pushing the limits of most organizations’ existing IT infrastructures. Our joint Unisys and EMC Big Data solutions enable enterprise-ready analytics.

To address this Big Data challenge, organizations need a highly scalable infrastructure that efficiently supports rapidly growing data storage requirements. And to unlock the value of their data assets, many leading organizations are deploying powerful new analytics tools that deliver insights of real benefit to the business.


How We Can Help

Unisys and EMC provide comprehensive solutions to help organizations address their unique Big Data challenges. Our solutions combine EMC Isilon scale-out data storage with a comprehensive array of data analytics services from Unisys that are tailored to the specific needs of our customers.

Innovation Workshop

These workshops are typically half-day to full-day on-site sessions with selected stakeholders that provide perspective on the latest technology architectures and analytical algorithms supporting Big Data. Each workshop is customized to meet the specific needs of your organization.

The workshops provide guidance in the following areas: infrastructure, data processing and storage, data transformations/views, analytics, and visualization/business intelligence.

Information Capability Assessment

The Information Capability Assessment is a two- to four-week onsite engagement to understand our customer’s current-state information assets and capabilities.

The outcome of the assessment provides our customers with a current-state analysis that identifies gaps and offers guidance on improvements. Additionally, a future-state roadmap is provided to help drive the organization’s progress forward.

Infrastructure Consulting

Our Infrastructure Consulting service provides guidance and helps you develop an infrastructure strategy to address your Big Data environment and organizational goals. Data storage hardware, as well as data management and protection requirements, will be assessed.

Your Unisys and EMC team can help you design a Big Data solution that complements your existing IT infrastructure investment while providing the scalability to meet growing data needs. Infrastructure Consulting engagements typically range from thirty to sixty days.

Data Analytics Strategy

The Data Analytics Strategy effort typically takes thirty days but can extend to sixty. It provides consultative expertise to develop an analytics strategy within the organization. Customers gain insight into the key initiatives required to turn data analytics into action and to increase their overall analytic maturity.

Data Analytics as a Service

Unisys implements and maintains the technical infrastructure, hardware, software, and analytical services for customers. This service allows clients to store equipment onsite or offsite at a secure Unisys location. Within this offering, the entire hardware and software stack is maintained by Unisys, and customers use and pay for only what they need.

Proof of Concept/Pilot

This is a quick thirty-day pilot that demonstrates key analytics concepts and the creation of data products. We use small data sets, provide the modeling, and present the business perspective needed to make critical decisions. Working together with EMC, Unisys provides an agile, rapid way to use analytics to increase efficiencies.


Efficient Data Processing

Challenges

  • Ineffective/inefficient storage and security platforms
  • The need to support information sharing initiatives
  • Complex existing applications and legacy structures

Objectives

  • Process data fast enough to make use of it in analysis or support Service Level Agreements
  • Access and integrate data across multiple formats and structures
  • Represent data in a uniform way and expose that to applications
  • Respond to changing business needs and new data requirements

Powered by EMC Technology:

EMC is the first and only storage vendor to provide native Hadoop integration with scale-out storage. Our solution is designed to deliver a number of key benefits:

  • Flexible storage infrastructure – EMC Isilon Scale-out NAS provides a highly scalable Hadoop platform that easily leverages other enterprise applications and workflows.
  • Highly resilient storage – Eliminates Hadoop’s NameNode single point of failure by enabling all nodes in an EMC Isilon storage cluster to act as NameNodes.
  • Unmatched efficiency – With Isilon solutions, customers can achieve eighty percent or more storage utilization, which reduces costs and simplifies management.
  • End-to-end data protection – The EMC Isilon solution provides reliable, end-to-end data protection, including snapshotting for backup and recovery and data replication for disaster recovery (capabilities that are not available for Hadoop on DAS).
  • Massive scalability – EMC Isilon storage scales easily, and as storage capacity is added, performance increases linearly. Our solutions scale to over fifteen petabytes in a single cluster. Customers can scale their compute and storage resources independently.
  • Multi-protocol support – Our native Hadoop integration enables use of Hadoop data and analytics together with other enterprise applications and workflows while eliminating the need to manually move data around (as required with direct-attached storage implementations).

Case Study:

Challenge

The client had a project-based procurement method for addressing data storage requirements, which resulted in numerous silos of data storage equipment and a limited ability to pool resources. The stove-piped acquisition, deployment, and management processes were costly to implement and did not allow other internal departments to share excess capacity, resulting in inefficient operations across departments and a large annual capital expense for the agency.

Solution

The Unisys solution leveraged the existing storage assets of the client, while allowing pooling of data storage assets. Our private Storage Cloud solution employs IBM SAN Volume Controller, a storage virtualization technology, coupled with our patented architecture to provide a robust, scalable storage cloud infrastructure for the client’s open systems computing platforms. With our patented storage virtualization cloud architecture inserted between the servers and physical arrays, the client now has enhanced storage features.

Results: Cost-effective scalable storage

Rapid, cost-effective provisioning with capital savings: our approach unlocked the value remaining in the client’s transferred storage assets, which were virtualized and consolidated into a modernized architecture. The new architecture provides significant scalability while enabling new storage cloud features.


Effective Information Management

Challenges

  • Inaccessible or siloed analytics (“Cylinders of Excellence”)
  • Misaligned IT, Analytics, and Business Strategies
  • Complex analytic requirements and information assets beyond existing analytic capabilities

Objectives

  • Get accurate real-time reports
  • Identify future trends and incorporate in decision making
  • Access to data needed to make informed decisions
  • Incorporate external data such as Crowdsourcing and social media
  • Agility to support rapid combination of analytic and data products, decreasing time to market for analytics
  • Increased understanding of data assets and value

Case Study:

Challenge

The client, an agency within the US government, was facing major challenges with data transparency, timeliness, and quality.

Solution

Unisys created a metadata-driven framework that allowed the client to streamline submission processes with its vendors, banks, and other agencies. We have proposed eliminating high-cost software licenses, virtualizing the environment, and moving to the AWS (Amazon Web Services) public cloud. We have also architected the system on the cloud for higher tolerance to failures, faster disaster recovery, and improved SLA performance.

Results: Faster data access with higher accuracy of results

  • The client has realized significant improvements in the information management process, including higher accuracy and rapid access to information that permits better deployment of client analysts.
  • Data asset quality rose from sixty-five to ninety-five percent of all data submissions, and analyst efficiency improved by roughly thirty percent.

Expressive Analytics

Challenges

  • Time to market for analytics is too long, hindering productivity
  • Untrusted analytic products, or analytics that are not timely, accurate, repeatable, or tested
  • Inability to scale analytic generation largely due to a lack of training

Objectives

  • Make use of analytics in business processes and in developing new strategies
  • Offload mundane and routine decisions
  • Focus on analyzing results rather than on structuring data
  • Gain understanding of the value of the data

Case Study:

Challenge

Assist the agency in protecting U.S. borders through risk assessment techniques by providing software technology solutions that support inspection and enforcement activities.

Solution

Enhance, administer, and maintain selectivity and targeting systems to help secure the supply chain and support strategies for international cargo and passengers.

Results: Superior mission-critical analytics

This project is a high-priority, mission-focused program for the agency. The targeting systems require ongoing operation, maintenance, and continuous development and refinement of analytics as threats change. Unisys’ superior performance, dedication to the client’s mission, and agile approach to analytics have increased the agency’s effectiveness and delivered operational efficiencies across the program. The solution supports the agency’s mission, adds agility to meet new threats, and facilitates continuous innovation.


Big Data Challenges

Building data analytic applications has been difficult for organizations because the required combination of software, hardware, and skills is often lacking even in the most mature technical organizations. Traditionally stove-piped disciplines must come together to build a cohesive system, and this type of team integration does not happen without a holistic process that brings technology and domain experts together.


Reference Architecture for a Data Analytics Platform





Visualization and Business Intelligence

Visualization and BI allow the enterprise to put the power of analytics in the business user’s hands. They provide business-friendly interfaces for what-if analysis, dashboards and reports, and forecasting and modeling.

Analytics

Analytics is the ability to perform ad hoc analysis, modeling and simulation, data mining, and pre-computation and aggregation on the underlying data. This step leverages the underlying technology stack to provide insight, and ultimately value, to the enterprise.
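The pre-computation idea can be sketched in a few lines: roll summary aggregates up from the raw facts once, so that ad hoc queries read a small summary instead of rescanning the full data set. The records and field names below are purely illustrative, not from any real system.

```python
from collections import defaultdict

# Illustrative fact records: (department, month, amount).
records = [
    ("finance", "2013-01", 120.0),
    ("finance", "2013-02", 80.0),
    ("ops", "2013-01", 200.0),
    ("ops", "2013-02", 50.0),
]

def precompute_rollup(rows):
    """Pre-aggregate total amount per department; ad hoc queries can then
    hit this small summary instead of rescanning every fact record."""
    rollup = defaultdict(float)
    for dept, _month, amount in rows:
        rollup[dept] += amount
    return dict(rollup)

summary = precompute_rollup(records)
print(summary)  # {'finance': 200.0, 'ops': 250.0}
```

At scale the same shape of computation would run inside the data processing and storage layer rather than in local memory.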

Transformations and Views

Transformations and Views allow for information extraction, data enrichment, and consolidated views. This layer provides the ability to operate against massive data sets and apply Natural Language Processing (NLP) capabilities such as text analytics and entity analytics.
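As a concrete, if deliberately naive, illustration of entity analytics, the sketch below pulls candidate named entities and monetary figures out of free text with regular expressions; a production pipeline would use a trained NLP model instead. All names in the sample text are made up.

```python
import re

def extract_entities(text):
    """Naive entity extraction: runs of capitalized words become candidate
    named entities, and dollar figures become monetary entities. Month
    names and sentence-initial words will show up as false positives."""
    names = re.findall(r"\b(?:[A-Z][a-z]+ )*[A-Z][a-z]+\b", text)
    money = re.findall(r"\$\d[\d,]*(?:\.\d+)?", text)
    return {"names": names, "money": money}

doc = "Acme Corp paid $1,200,000 to Globex in January."
print(extract_entities(doc))
```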

Data Processing and Storage

Data processing and storage are integral to the big data environment. This level of the architecture provides the cutting-edge software needed to handle big data and process it efficiently. Key items in this part of the architecture include distributed file systems (DFS) such as Hadoop HDFS, as well as NoSQL (Not Only SQL) databases such as HBase, Cassandra, and Accumulo, which allow for the storage and retrieval of unstructured data. Hadoop’s MapReduce framework spreads data processing jobs over commodity hardware, processing across multiple servers for speed and efficiency.
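The MapReduce pattern described above can be sketched without a cluster: map each partition of the input to (key, 1) pairs, then shuffle and reduce by summing per key. On a real cluster the partitions would be HDFS blocks and the two phases would run on separate nodes.

```python
from collections import defaultdict
from itertools import chain

def map_phase(partition):
    """Map: emit a (word, 1) pair for every word in one input partition."""
    return [(word.lower(), 1) for line in partition for word in line.split()]

def reduce_phase(pairs):
    """Shuffle + reduce: group the emitted pairs by word and sum the counts."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Two partitions standing in for HDFS blocks spread over commodity nodes.
partitions = [["big data big value"], ["data drives value"]]
mapped = chain.from_iterable(map_phase(p) for p in partitions)
print(reduce_phase(mapped))  # {'big': 2, 'data': 2, 'value': 2, 'drives': 1}
```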

Security

For big data to be easily accessed, shared, and exposed, systems must be able to efficiently and effectively control access to the data and guarantee its integrity. Access controls at the data set, record, and cell level impose varying degrees of enforcement, allowing the right information to be made available in a secure way.
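Cell-level access control of the kind Accumulo popularized can be sketched as cells tagged with visibility labels, with a read returning only the cells whose labels the caller holds. The record fields and label names below are hypothetical.

```python
# Each cell pairs a value with the set of labels required to read it.
record = {
    "name": ("Jane Analyst", {"public"}),
    "ssn": ("123-**-****", {"pii"}),
    "clearance": ("secret", {"hr", "pii"}),
}

def visible_cells(rec, authorizations):
    """Return only the cells whose required labels the caller holds."""
    return {k: v for k, (v, labels) in rec.items() if labels <= authorizations}

print(visible_cells(record, {"public"}))               # name only
print(visible_cells(record, {"public", "pii", "hr"}))  # all three cells
```

Real systems evaluate richer label expressions (AND/OR combinations) and enforce them inside the storage engine rather than in application code.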

Infrastructure

Infrastructure is the foundation of our reference architecture. It includes the hardware needed to properly stand up a big data environment: network, disk, and processor configurations. Additional items in this layer include system and network security, such as authentication, encryption of data in motion, and encryption of data at rest.




What's the Big Hadoop-la?

Rod Fontecilla – Posted 04/2/13

The big data space is as complex as it is large. The challenges of big data – volume, variety, velocity, and the sheer number of vendors – create a confusing dilemma for government agencies trying to select the best platform to meet mission needs. As in the cloud space, vendors are using big data as a single term that addresses many of the problems agencies may be experiencing. Agencies seek insight, which their analysts use to advance their mission by converting raw data into actionable information and knowledge. Huge and growing amounts of raw data must be integrated and sliced and diced to support data-driven decision making. Fortunately, a new simplified method is helping agencies zero in on the crowded big data space with a vendor-neutral approach to picking the most flexible technology for their needs. Using such a framework, agencies can pinpoint what value should be extracted from existing data and which data will be most useful.

A major financial agency, as one example, is experiencing an influx of data as its mission adapts to the increased processing needs of an expanding mass of petabytes of data following the ongoing economic crisis. Analyzing economic indicators and forecasting sensitive data trends, the agency is trying to identify new data management technology to sift through data from a host of new sources, including hedge funds and mortgage institutions. Unfortunately, the agency’s current architecture does not support the slicing and dicing of data that its analysts need now. By using a mission-driven approach that focuses on the analytics and the data, the agency will get through the analysis paralysis that can...



A Roadmap for Building a Robust Data Analytics Environment

Rod Fontecilla - Posted 06/26/13

Organizations of all sizes rely on Hadoop implementations to derive business insights from their data. Our experience building large and complex data analytics environments for the Federal government and private companies shows that this “Hadoop-la” effect is misguided and can potentially drive precious resources in the wrong direction.

Below are nine steps to build a flexible, scalable, and reliable data analytics environment, tightly integrated with your existing data architecture. These steps provide a roadmap for data analytics and yield critical business insights to drive efficiencies in any agency.

  1. Work with business leaders to understand and quantify your data value chain – To extract vital insight from your data, it is essential to work with stakeholders and understand the business value associated with it. The implementation of an encompassing enterprise data management strategy is required to achieve this.
  2. View data as an enterprise asset – Technology is no longer the driving force in an IT environment; information – the “I” – now drives future business decisions. As data becomes a fundamental asset of your enterprise, agencies need to extract the most pertinent information from existing data to create new value.
  3. Innovate through the creation of new data products and services – Convert data to information through the creation of data products. Data products provide users with the ultimate tool to improve current business processes, identify hidden efficiencies, and create new revenue channels.
  4. Retrain staff and/or acquire data scientist skills – In order to create powerful data products, data scientists – not just the typical IT staff – are necessary to ingest, analyze and predict the future. Data scientists are also versed in data model creation and pattern recognition.
  5. Integrate teams across big data, data warehousing, and business analysis – Data scientists need to integrate into your existing data environment by executing your data warehousing and business intelligence strategies.
  6. Revise information management strategies to incorporate big data – Take a step back here to reformulate your information management strategy to encompass new tools and technology that will help drive the analytic process.
  7. Develop new ways of capturing information e.g., mobile and streaming data – Providing a flexible and scalable environment to support mobile devices and the streaming of data from multiple sensors is critical. The “Internet of All Things” creates a massive amount of data and great data stream opportunities for your agency.
  8. Identify and leverage previously unused internal and external data – Data that has gone unused for various reasons should be surfaced and linked to new data sets. As the social matrix continues to expand, the data obtained will help accelerate knowledge sharing and shorten response times.
  9. Automate knowledge work – Use learning algorithms to search for essential information and find patterns of meaning at superhuman speed. Retraining your workforce to effectively use critical new pieces of data is also essential to maximizing your data analytics environment.

Contact Us

We are interested in hearing about your Big Data challenges and look forward to the opportunity to learn more to see if our solutions can help address your unique challenges. Please submit your information and a representative will contact you as soon as possible.


Thank you for your submission. We will contact you shortly.