data volume

Published By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Modern storage arrays can’t compete on price without a range of data reduction technologies that lower the total cost of ownership of external storage. Unfortunately, no single data reduction technology fits all data types; savings come from both data deduplication and compression, depending on the workload. Typically, OLTP-type data (databases) works well with compression and can achieve between 2:1 and 3:1 reduction, depending on the data itself. Deduplication works well with large volumes of repeated data, such as virtual machines or virtual desktops, where many instances or images are based on a similar “gold” master.
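The distinction the abstract draws can be illustrated with a small, self-contained sketch: hash-based deduplication stores each unique fixed-size chunk only once, while compression shrinks every chunk individually. The chunk size, synthetic data and resulting ratios below are purely illustrative assumptions and do not reflect how any particular array implements data reduction.

```python
import hashlib
import zlib

BLOCK_SIZE = 4096  # illustrative fixed-size chunking

def dedup_ratio(data: bytes) -> float:
    """Logical size vs. space needed when each unique chunk is stored only once."""
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    unique = {hashlib.sha256(b).digest() for b in blocks}
    return len(data) / (len(unique) * BLOCK_SIZE)

def compression_ratio(data: bytes) -> float:
    """Logical size vs. space needed when every chunk is compressed independently."""
    compressed = sum(len(zlib.compress(data[i:i + BLOCK_SIZE]))
                     for i in range(0, len(data), BLOCK_SIZE))
    return len(data) / compressed

# Synthetic data, for illustration only: many copies of one "gold" image dedupe
# almost entirely, while varied row-like data mostly benefits from compression.
gold_block = bytes(range(256)) * 16                      # one 4 KiB pattern
vm_images = gold_block * 200                             # 200 near-identical images
oltp_rows = b"".join(f"id={i},name=cust_{i},amt={i * 7 % 997}\n".encode()
                     for i in range(20_000))

print(f"dedup ratio (VM-like data):         {dedup_ratio(vm_images):.0f}:1")
print(f"compression ratio (OLTP-like data): {compression_ratio(oltp_rows):.1f}:1")
```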
Tags : 
    
Hewlett Packard Enterprise
Published By: Hewlett Packard Enterprise     Published Date: Jul 19, 2018
The next wave of cloud storage innovation is upon us. It’s called multicloud. With multicloud storage you can combine cloud simplicity with enterprise-grade reliability, provide data mobility among multiple cloud types, and eliminate vendor lock-in. And it’s available right now through the Nimble Cloud Volumes service.
Tags : 
cloud, storage, flash
    
Hewlett Packard Enterprise
Published By: HP     Published Date: May 14, 2014
In today’s world of explosive data volumes, midmarket companies face a growing number of challenges in safeguarding data, and current approaches to backup and data protection are falling short. This brief outlines HP solutions that meet IT needs for backup appliances tightly integrating deduplication, data protection and recovery.
Tags : 
hewlett packard, data, backup, data protection, recovery, deduplication
    
HP
Published By: HP     Published Date: May 14, 2014
Your data is everywhere. It’s in the cloud, in virtual environments, in remote offices, and on mobile devices. Plus, there’s more of it, and much of it is business-critical, meaning you need to back up larger volumes of data than ever before without adding costs or bandwidth. Even more importantly, you need the ability to restore data quickly in the event of a disaster, failure, or outage. The cost of data protection is now higher than ever.
Tags : 
hewlett packard, cloud, mobile, remote, data recovery, disaster, failure
    
HP
Published By: HP     Published Date: Feb 02, 2015
Ever-increasing data volumes, driven by constant growth in both structured and unstructured data and coupled with the ever-decreasing cost of storage capacity per GB, continue to strain corporate backup capabilities. While other backup and data optimization technologies offer some relief, deduplicating backup appliances have become the go-to solution. They provide a quick, largely non-disruptive, plug-and-play solution that alleviates backup pain and reduces storage consumption by up to 20x, and they have become a proven frontrunner in the ongoing battle to improve the backup experience.
Tags : 
    
HP
Published By: HP     Published Date: Feb 11, 2015
Ever-increasing data volumes, driven by constant growth in both structured and unstructured data and coupled with the ever-decreasing cost of storage capacity per GB, continue to strain corporate backup capabilities. While other backup and data optimization technologies offer some relief, deduplicating backup appliances have become the go-to solution. They provide a quick, largely non-disruptive, plug-and-play solution that alleviates backup pain and reduces storage consumption by up to 20x, and they have become a proven frontrunner in the ongoing battle to improve the backup experience.
Tags : 
    
HP
Published By: HP - Enterprise     Published Date: Jun 04, 2013
Businesses are overwhelmed with data; it’s a blessing and a curse. A curse because it can overwhelm traditional approaches to storing and processing it. A blessing because the data promises business insight that never existed before. The industry has spawned a new term, “big data,” to describe it. Now, IT itself is overwhelmed with its own big data. In the push to roll out new services and technologies (mobility, cloud, virtualization), applications, networks, and physical and virtual servers grow into a sprawl. With them comes an unprecedented volume of data such as logs, events, and flows. It takes too much time and too many resources to sift through it, so most of it lies unexplored and unexploited. Yet, like business data, it contains insight that can help us solve problems, make decisions, and plan for the future.
Tags : 
data research, big data, virtualization, applications, networks
    
HP - Enterprise
Published By: HP Inc.     Published Date: Feb 14, 2019
How to overcome complexity and be ready for future growth. The modern production digital print market has brought a world of opportunities for micro-runs and variable data printing into the hands of Print Service Providers (PSPs). With print available in any quantity to anyone, printing is being democratized, resulting in ever-greater numbers of increasingly smaller jobs. The old model of high-volume static print is being replaced with a myriad of micro-runs and micro-jobs. Automation that allows PSP owners to master the ensuing complexity and respond faster than ever before is enabling them to refocus on market development and on delivering value.
Tags : 
    
HP Inc.
Published By: HP Inc.     Published Date: Jun 20, 2019
Four billion people now generate four quintillion bytes of data every day, and with the number of IoT devices set to increase to three times the global population by 2022, volumes will only continue to rise. The challenge is processing that data. This is why machine learning, deep learning and all the other developing forms of AI must deliver the analytics toolset businesses need to compete.
Tags : 
    
HP Inc.
Published By: IBM     Published Date: May 30, 2008
Not all data is created equal, however, and deriving business value from information dispersed over many different sources means integrating, analyzing and optimizing heterogeneous types and sources of information throughout its lifecycle. Find out how, when deployed together in the same environment, IBM TotalStorage Productivity Center, IBM System Storage SAN Volume Controller and IBM Tivoli Storage Manager deliver the core technologies needed to help customers tier their storage based on classes of service.
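Tiering storage by class of service can be sketched generically. The tier names, thresholds and policy below are hypothetical illustrations, not the behavior or API of the IBM products named above.

```python
from dataclasses import dataclass

# Hypothetical classes of service mapped to storage tiers (illustrative only).
TIERS = {
    "gold":   "flash / highest availability",
    "silver": "SAS disk / standard availability",
    "bronze": "nearline disk / archival",
}

@dataclass
class Volume:
    name: str
    accesses_per_day: int
    business_critical: bool

def assign_tier(vol: Volume) -> str:
    """Pick a tier from simple class-of-service rules."""
    if vol.business_critical or vol.accesses_per_day > 1_000:
        return "gold"
    if vol.accesses_per_day > 50:
        return "silver"
    return "bronze"

for v in (Volume("erp_db", 5_000, True), Volume("file_archive", 3, False)):
    tier = assign_tier(v)
    print(f"{v.name}: {tier} ({TIERS[tier]})")
```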
Tags : 
information lifecycle, information management, tivoli, storage management, leveraging information, ibm, ibm li, li campaign
    
IBM
Published By: IBM     Published Date: Dec 03, 2008
As your data volume and value increases, reduce your risk with the IBM System Storage DS3400 Express. This direct-attach storage (DAS) or SAN solution is scalable and expandable to consolidate your data. It’s also flexible and affordable for small and midsize businesses. Get all the details at a glance in this data sheet.
Tags : 
ibm, express seller, system storage, ds3400, san solution, ds3400 express
    
IBM
Published By: IBM     Published Date: Sep 10, 2009
With ever-increasing data growth worldwide, organizations must find smarter ways to store and manage their massive volumes of data. Learn how IBM® Tivoli® Storage Management software solutions help to maximize your current storage environment and reduce operational and capital costs while improving service and managing risks.
Tags : 
storage management, ibm, ibmtivoli, fastback, san, data growth, recovery management, reducing costs, improving service, managing risks, dynamic storage, infrastructure, virtualization, resource management
    
IBM
Published By: IBM     Published Date: Jul 05, 2012
Esri, a U.S. software company, saw its data volumes grow as it added new applications. System response slowed and its existing infrastructure had no room for expansion. Read the white paper to see how the IBM Migration Factory helped Esri consolidate 16 physical servers to two, automate storage management and drastically cut costs.
Tags : 
ibm, technology, sap, software, migration factory, storage
    
IBM
Published By: IBM     Published Date: Sep 06, 2013
In this IBM security report, we will take a look at the data we've gathered through our monitoring operations and the security intelligence generated by our analysts and incident response teams who interpret that data. Our aim is to help you gain important insights into the current threat landscape - with a close look at the volume of attacks, the industries most impacted, the most prevalent types of attacks and attackers, and the key factors enabling them.
Tags : 
ibm, security services, cyber security intelligence, index, security report, security attacks, security protection, business security, data security, security intelligence index
    
IBM
Published By: IBM     Published Date: Dec 06, 2013
Partners and customers expect instantaneous response and continuous uptime from data systems, but the volume and velocity of data make it difficult to respond with agility. IBM PureData System for Transactions enables businesses to gear up and meet these challenges.
Tags : 
ibm, ibm puredata system, data, data mangement, database, integration, transactions, workload, availability, data system
    
IBM
Published By: IBM     Published Date: Jan 02, 2014
This study highlights the phases of the big data journey, the objectives and challenges of midsize organizations taking the journey, and the current state of the technology that they are using to drive results. It also offers a pragmatic course of action for midsize companies to take as they dive into this new era of computing.
Tags : 
ibm, analytics, global business service, big data, business value, it professionals, volume, velocity, variety, customer analytics, trends and insights, information management, data security, integration, variety of data, analytic accelerator, infrastructure
    
IBM
Published By: IBM     Published Date: Jan 09, 2014
Simply put, software defined storage is the abstraction of storage services from storage hardware. The term is more than just marketing hype; it describes the logical evolution of storage virtualization from being a simple storage aggregator to the end goal of storage as a service. To achieve this goal, software defined storage needs a platform from which to centralize these services.
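As a rough illustration of abstracting storage services from the underlying hardware, the sketch below defines a provisioning interface that different hardware backends implement, with a thin service layer that callers address by class of service. All class and method names are hypothetical and are not drawn from any vendor's API.

```python
from abc import ABC, abstractmethod
from typing import Dict

class StorageBackend(ABC):
    """Hardware-specific driver; callers of the service layer never see it directly."""
    @abstractmethod
    def create_volume(self, size_gb: int) -> str: ...

class FlashArrayBackend(StorageBackend):
    def create_volume(self, size_gb: int) -> str:
        return f"flash-vol-{size_gb}gb"      # stand-in for a real array call

class CommodityDiskBackend(StorageBackend):
    def create_volume(self, size_gb: int) -> str:
        return f"disk-vol-{size_gb}gb"

class StorageService:
    """The 'software defined' layer: requests name a class of service, not a device."""
    def __init__(self, backends: Dict[str, StorageBackend]):
        self.backends = backends

    def provision(self, size_gb: int, service_class: str) -> str:
        return self.backends[service_class].create_volume(size_gb)

svc = StorageService({"premium": FlashArrayBackend(), "standard": CommodityDiskBackend()})
print(svc.provision(100, "premium"))         # caller never touches the hardware layer
```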
Tags : 
ibm, storage switzerland, storage platform, sds, software defined storage, data center, storage virtualization, data protection, shared storage, storage capacity, centric systems, high performance databases, storage efficiency, range of solutions, vendor options, volume management, software selection, data service, data management, performance loss
    
IBM
Published By: IBM     Published Date: Mar 05, 2014
For many years, companies have been building data warehouses to analyze business activity and produce insights for decision makers to act on to improve business performance. These traditional analytical systems are often based on a classic pattern where data from multiple operational systems is captured, cleaned, transformed and integrated before being loaded into a data warehouse. Typically, a history of business activity is built up over a number of years, allowing organizations to use business intelligence (BI) tools to analyze, compare and report on business performance over time. In addition, subsets of this data are often extracted from data warehouses into data marts that have been optimized for more detailed multi-dimensional analysis.
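A minimal sketch of the classic pattern described above, assuming hypothetical table and column names: rows are captured from an operational source, cleaned and transformed, loaded into a warehouse fact table, and then summarized into a data mart.

```python
import sqlite3

# In-memory stand-in for a data warehouse and a derived data mart.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE sales_fact (order_id INT, region TEXT, amount REAL, year INT)")
warehouse.execute("CREATE TABLE sales_by_region_mart (region TEXT, year INT, total REAL)")

operational_rows = [                       # rows captured from an operational system
    ("1001", " east ", "250.00", "2013"),
    ("1002", "WEST",   "99.50",  "2013"),
    ("1003", "east",   None,     "2014"),  # dirty record: missing amount
]

def clean_and_transform(row):
    order_id, region, amount, year = row
    if amount is None:                     # cleaning: drop incomplete records
        return None
    return (int(order_id), region.strip().lower(), float(amount), int(year))

loaded = [r for r in map(clean_and_transform, operational_rows) if r]
warehouse.executemany("INSERT INTO sales_fact VALUES (?, ?, ?, ?)", loaded)

# Extract a subset optimized for aggregated, multi-dimensional analysis.
warehouse.execute("""
    INSERT INTO sales_by_region_mart
    SELECT region, year, SUM(amount) FROM sales_fact GROUP BY region, year
""")
print(warehouse.execute("SELECT * FROM sales_by_region_mart").fetchall())
```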
Tags : 
ibm, big data, data, big data platform, analytics, data sources, data complexity, data volume, data generation, data management, storage, acceleration, business intelligence, data warehouse
    
IBM
Published By: IBM     Published Date: May 27, 2014
Big data and analytics help insurance companies identify the next best action for customers. With the right solutions, companies can extract, integrate and analyze a large volume and variety of data, from call-center notes and voice recordings to web chats, telematics and social media.
Tags : 
ibm, big data, analytics, insurance, insurance industry, big data solutions, integration, risk assessment, policy rates, customer retention, claims data, transaction data
    
IBM
Published By: IBM     Published Date: May 28, 2014
Different types of data have different data retention requirements. In establishing information governance and database archiving policies, take a holistic approach by understanding where the data exists, classifying the data, and archiving the data. The IBM InfoSphere Optim™ Archive solution can help enterprises manage and support data retention policies by archiving historical data and storing that data in its original business context, all while controlling growing data volumes and improving application performance. This approach helps support long-term data retention by archiving data in a way that allows it to be accessed independently of the original application.
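One way to picture retention-driven archiving is the toy sketch below: records are classified by a retention class, and those past their active window are written out as self-contained documents so they remain readable without the original application. The retention classes, record layout and thresholds are hypothetical and do not describe the InfoSphere Optim mechanism itself.

```python
import json
from datetime import date

# Hypothetical retention classes: years of active retention before archiving.
RETENTION_YEARS = {"financial": 7, "customer": 5, "operational": 2}

records = [
    {"id": 1, "class": "financial",   "created": "2005-03-01", "order": {"total": 120.0}},
    {"id": 2, "class": "operational", "created": "2013-11-15", "order": {"total": 40.0}},
]

def is_archivable(record, today=date(2014, 1, 1)):
    created = date.fromisoformat(record["created"])
    return today.year - created.year >= RETENTION_YEARS[record["class"]]

# Archive as self-contained documents so they keep their business context
# and stay readable independently of the source application.
archive = [json.dumps(r) for r in records if is_archivable(r)]
active = [r for r in records if not is_archivable(r)]
print(f"archived {len(archive)} record(s), {len(active)} remain active")
```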
Tags : 
ibm, data retention, information governance, archiving, historical data, integrating big data, governing big data, integration, best practices, big data, ibm infosphere, it agility, performance requirements, hadoop, scalability, data integration, big data projects, high-quality data, leverage data replication, data persistence
    
IBM
Published By: IBM     Published Date: Jun 20, 2014
Data volumes are getting out of control, but choosing the right ILG solution can be a huge challenge, with multiple stakeholders, numerous business processes, and extensive solution requirements. Use this requirements kit from CGOC to find the tools and technology you need.
Tags : 
ibm, ilg, information lifecycle governance, information management, ilg solutions, data storage, data management
    
IBM
Published By: IBM     Published Date: Aug 05, 2014
There is a lot of discussion in the press about Big Data. Big Data is traditionally defined in terms of the three V’s of Volume, Velocity, and Variety. In other words, Big Data is often characterized as high-volume, streaming, and including semi-structured and unstructured formats. Healthcare organizations have produced enormous volumes of unstructured data, such as the notes by physicians and nurses in electronic medical records (EMRs). In addition, healthcare organizations produce streaming data, such as from patient monitoring devices. Now, thanks to emerging technologies such as Hadoop and streams, healthcare organizations are in a position to harness this Big Data to reduce costs and improve patient outcomes. However, this Big Data has profound implications from an Information Governance perspective. In this white paper, we discuss Big Data Governance from the standpoint of three case studies.
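To make the velocity aspect concrete, here is a toy sketch of a sliding-window check over simulated patient-monitor readings; a real deployment would run on a streaming platform, and the field names and thresholds here are purely hypothetical.

```python
from collections import deque

WINDOW = 5                 # readings per sliding window (hypothetical)
HEART_RATE_LIMIT = 120     # alert threshold, illustrative only

def monitor(readings):
    """Flag windows whose average heart rate exceeds the threshold as data streams in."""
    window = deque(maxlen=WINDOW)
    for t, hr in readings:
        window.append(hr)
        if len(window) == WINDOW and sum(window) / WINDOW > HEART_RATE_LIMIT:
            yield (t, sum(window) / WINDOW)

stream = [(t, 80 + (t % 7) * 12) for t in range(20)]   # simulated device feed
for timestamp, avg in monitor(stream):
    print(f"t={timestamp}: average heart rate {avg:.0f} exceeds {HEART_RATE_LIMIT}")
```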
Tags : 
ibm, data, big data, information, healthcare, governance, technology
    
IBM
Published By: IBM     Published Date: Aug 06, 2014
Big data and analytics help insurance companies identify the next best action for customers. With the right solutions, companies can extract, integrate and analyze a large volume and variety of data, from call-center notes and voice recordings to web chats, telematics and social media.
Tags : 
ibm, insurance, data, big data, analytics, solutions
    
IBM
Published By: IBM     Published Date: Aug 08, 2014
Big data and analytics help insurance companies identify the next best action for customers. With the right solutions, companies can extract, integrate and analyze a large volume and variety of data, from call-center notes and voice recordings to web chats, telematics and social media.
Tags : 
big data, analytics, insurance, customer service, solutions
    
IBM