data volume

Published By: MarkLogic     Published Date: Mar 29, 2018
Real World Evidence (RWE) requires the correlation of complex, frequently changing, unstructured data. To the enterprise architect, that means extracting value from data that doesn't fit neatly into existing solutions. In this white paper, we dive into the details of why relational databases are ill-suited to handle the massive volumes of disparate, varied, and changing data that are required to be successful with RWE. It is for this reason that leading life science organizations are going beyond relational to embrace new kinds of databases. And when they do, the results can be dramatic.
Tags : 
data, integration, volume, optimization, architect, enterprise
    
MarkLogic
Published By: Hewlett Packard Enterprise     Published Date: Mar 26, 2018
Modern storage arrays can’t compete on price without a range of data reduction technologies that help reduce the overall total cost of ownership of external storage. Unfortunately, there is no one single data reduction technology that fits all data types and we see savings being made with both data deduplication and compression, depending on the workload. Typically, OLTP-type data (databases) work well with compression and can achieve between 2:1 and 3:1 reduction, depending on the data itself. Deduplication works well with large volumes of repeated data like virtual machines or virtual desktops, where many instances or images are based off a similar “gold” master.
Tags : 
    
Hewlett Packard Enterprise
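The ratios cited above translate directly into capacity savings. As a minimal sketch (the 100 TB figure and function names are illustrative assumptions, not measurements from any array):

```python
# Hypothetical illustration of data-reduction arithmetic: how a given
# reduction ratio converts into logical capacity and space saved.

def effective_capacity(raw_tb: float, reduction_ratio: float) -> float:
    """Logical data that fits in raw_tb of physical storage at reduction_ratio:1."""
    return raw_tb * reduction_ratio

def savings_pct(reduction_ratio: float) -> float:
    """Percentage of physical space saved versus storing the data unreduced."""
    return (1 - 1 / reduction_ratio) * 100

# OLTP-style data at the paper's typical 2:1 to 3:1 compression range
print(effective_capacity(100, 2.0))   # 200.0 TB logical on 100 TB raw
print(round(savings_pct(3.0), 1))     # 66.7 (% saved at 3:1)
```

Note how savings flatten as ratios climb: 2:1 already saves 50%, while going from 3:1 to 4:1 only adds about 8 points.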
Published By: AWS     Published Date: Mar 22, 2018
Healthcare and Life Sciences organizations are adopting cloud-based workloads at a significant pace. A 2017 HIMSS study found that 65% of Healthcare organizations were using cloud-based services, and nearly 88% of those organizations were utilizing Software-as-a-Service (SaaS) solutions, which have become the preferred deployment method for many clinical application vendors. This eBook highlights advantages of using AWS to create and maintain cloud-based Next-Gen BI Solutions for Healthcare and Life Sciences organizations. This includes use cases from diverse organizations that have utilized AWS and APN Partners to manage and analyze data, and to discover insights otherwise obscured by the sheer volume of available information. Solutions from APN Partners can help your organization take the next step in building robust processes for making data-driven decisions that improve patient care, organizational processes, and innovative product development efforts.
Tags : 
    
AWS
Published By: IBM APAC     Published Date: Mar 19, 2018
Unstructured data has exploded in volume over the past decade. Unstructured data, media files and other data can be created just about anywhere on the planet using almost any smart device available today. As the amount of unstructured data grows exponentially, customers using this data need to be able to take advantage of the right storage solutions to support all of their file and object data requirements. IBM® recently added a new storage system to their Spectrum product family, IBM Spectrum Network Attached Storage (NAS). IBM Spectrum NAS adds another software-defined file storage system to IBM’s current unstructured data storage solutions, IBM Spectrum Scale™ and IBM Cloud Object Storage (COS). Below, we will discuss the three systems and supply some guidance on when and where to use each of them.
Tags : 
    
IBM APAC
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
Compression algorithms reduce the number of bits needed to represent a set of data—the higher the compression ratio, the more space this particular data reduction technique saves. During our OLTP test, the Unity array achieved a compression ratio of 3.2-to-1 on the database volumes, whereas the 3PAR array averaged a 1.3-to-1 ratio. In our data mart loading test, the 3PAR achieved a ratio of 1.4-to-1 on the database volumes, whereas the Unity array achieved 1.3-to-1.
Tags : 
    
Dell PC Lifecycle
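The comparison above can be made concrete by computing the physical footprint of the same database volume under each measured ratio. A brief sketch (the 10 TB volume size is an assumption for illustration, not a figure from the test):

```python
# Physical space consumed by a 10 TB logical database volume under the
# two compression ratios reported in the OLTP test.

def physical_size(logical_tb: float, ratio: float) -> float:
    """Physical TB needed to store logical_tb at a ratio:1 compression ratio."""
    return logical_tb / ratio

unity = physical_size(10, 3.2)   # ~3.1 TB physical
par3 = physical_size(10, 1.3)    # ~7.7 TB physical
print(round(unity, 2), round(par3, 2))
```

At these ratios the Unity array would need less than half the physical capacity of the 3PAR for the same OLTP data set.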
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
Compression algorithms reduce the number of bits needed to represent a set of data: the higher the compression ratio, the more space this data reduction technique saves. During our OLTP test, the Unity array achieved a compression ratio of 3.2-to-1 on the database volumes, while the 3PAR array averaged a 1.3-to-1 ratio. In the data mart loading test, the 3PAR achieved a 1.4-to-1 ratio on the database volumes, while the Unity array recorded 1.3-to-1.
Tags : 
    
Dell PC Lifecycle
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
Compression algorithms reduce the number of bits needed to represent a given data set. The higher the compression ratio, the more storage space this particular data reduction technique saves. During our OLTP test, the Unity array achieved a compression ratio of 3.2:1 on the database volumes, while the 3PAR array averaged only 1.3:1. In our data mart loading test, the 3PAR achieved a 1.4:1 ratio on the database volumes, the Unity array only 1.3:1.
Tags : 
    
Dell PC Lifecycle
Published By: LogMeIn     Published Date: Feb 27, 2018
24/7 Self-Service Support Center: Bold360 ai's 24/7 context-driven support center was implemented, allowing users to instantly discover relevant content from the smart knowledge database. Dynamic FAQs displayed trending topics in real time to speed up customer resolution and discoverability. Real-time customer analytics highlighted unanswered questions, giving Premium Credit instant visibility into missing topics, questions driving ticket volume, and more.
Tags : 
customer, support, faq, credit
    
LogMeIn
Published By: MobileIron     Published Date: Feb 26, 2018
Enterprises are increasingly expected to support Macs as corporate-approved devices. In order to be in compliance, it is imperative that all devices accessing sensitive corporate and customer data be fully secured and managed. MobileIron delivers a new model for authentication and identity to Macs and enables enterprises to unify Apple mobile and desktop operations using a common security and management platform. MobileIron's layered security can be extended not only to corporate-owned Macs, but to employee-owned devices as well. And IT organizations can bring Macs under management across the organization with unparalleled speed and at scale, thanks to seamless integration with Apple's Device Enrollment Program (DEP) and Volume Purchase Program (VPP).
Tags : 
    
MobileIron
Published By: Cloudian     Published Date: Feb 21, 2018
We are critically aware of the growth in stored data volumes putting pressure on IT budgets and services delivery. Burgeoning volumes of unstructured data commonly drive this ongoing trend. However, growth in database data can be expected as well as enterprises capture and analyze data from the myriad of wireless devices that are now being connected to the Internet. As a result, stored data growth will accelerate. Object-based storage systems are now available that demonstrate these characteristics. While they have a diverse set of use cases, we see several vendors now positioning them as on-premises targets for backups. In addition, integration of object-based data protection storage with cloud storage resources is seen by these vendors as a key enabler of performance at scale, cost savings, and administrative efficiency.
Tags : 
    
Cloudian
Published By: Cloudian     Published Date: Feb 20, 2018
We are critically aware of the growth in stored data volumes putting pressure on IT budgets and services delivery. Burgeoning volumes of unstructured data commonly drive this ongoing trend. However, growth in database data can be expected as well as enterprises capture and analyze data from the myriad of wireless devices that are now being connected to the Internet. As a result, stored data growth will accelerate.
Tags : 
    
Cloudian
Published By: Cloudian     Published Date: Feb 15, 2018
We are critically aware of the growth in stored data volumes putting pressure on IT budgets and services delivery. Burgeoning volumes of unstructured data commonly drive this ongoing trend. However, growth in database data can be expected as well as enterprises capture and analyze data from the myriad of wireless devices that are now being connected to the Internet. As a result, stored data growth will accelerate. Object-based storage systems are now available that demonstrate these characteristics. While they have a diverse set of use cases, we see several vendors now positioning them as on-premises targets for backups. In addition, integration of object-based data protection storage with cloud storage resources is seen by these vendors as a key enabler of performance at scale, cost savings, and administrative efficiency.
Tags : 
    
Cloudian
Published By: Group M_IBM Q1'18     Published Date: Jan 23, 2018
Flexible deployment options and licensing models help take the challenges out of change. As you move toward the cloud, you're likely planning or managing a mixed environment of on-premises and on-cloud applications. To help you succeed in this transition, you need a transformative, mixed-workload database that can handle a massive volume of data while delivering high performance, data availability, and the flexibility to adapt and respond to business changes.
Tags : 
cloud applications, database, data volume, data availability
    
Group M_IBM Q1'18
Published By: Gemini Data     Published Date: Jan 16, 2018
The increasing reliance on big data platforms for all functions of the organization has been transformative. As these environments mature and data volumes increase, organizations face infrastructure and management scalability challenges. Gemini Enterprise Manager simplifies deployment and management with a turnkey, NoOps appliance, providing simplicity, security, and speed to accelerate the time to value for any analysis use case. Manager allows you to control your Splunk deployment as a single, unified solution deployed on premises, in the cloud or both.
Tags : 
    
Gemini Data
Published By: Oracle ZDLRA     Published Date: Jan 10, 2018
Traditional backup systems fail to meet the database protection and recovery requirements of modern organizations. These systems require ever-growing backup windows, negatively impact performance in mission-critical production databases, and deliver recovery time objectives (RTO) and recovery point objectives (RPO) measured in hours or even days, failing to meet the requirements of high-volume, high transactional databases -- potentially costing millions in lost productivity and revenue, regulatory penalties, and reputation damage due to an outage or data loss.
Tags : 
data protection, backup speed, recovery, overhead, assurance, storage, efficiency, oracle
    
Oracle ZDLRA
Published By: Group M_IBM Q1'18     Published Date: Jan 08, 2018
Effectively using and managing information has become critical to driving growth in areas such as pursuing new business opportunities, attracting and retaining customers, and streamlining operations. In the era of big data, you must accommodate a rapidly increasing volume, variety and velocity of data while extracting actionable business insight from that data, faster than ever before. These needs create a daunting array of workload challenges and place tremendous demands on your underlying IT infrastructure and database systems. This e-book presents six reasons why you should consider a database change, including opinions from industry analysts and real-world customer experiences. Read on to learn more.
Tags : 
database, streamlining, it infrastructure, database systems
    
Group M_IBM Q1'18
Published By: Carbonite     Published Date: Jan 04, 2018
Malware that encrypts a victim’s data until the extortionist’s demands are met is one of the most common forms of cybercrime. And the prevalence of ransomware attacks continues to increase. Cybercriminals are now using more than 50 different forms of ransomware to target and extort money from unsuspecting individuals and businesses. Ransomware attacks are pervasive. More than 4,000 ransomware attacks happen every day, and the volume of attacks is increasing at a rate of 300 percent annually. According to an IDT911 study, 84 percent of small and midsize businesses will not meet or report ransomware demands. No one is safe from ransomware, as it attacks enterprises and SMBs, government agencies, and individuals indiscriminately. While ransomware demands more than doubled in 2016 to $679 from $294 in 2015, the cost of remediating the damage and lost productivity is many multiples higher.
Tags : 
    
Carbonite
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
As organizations develop next-generation applications for the digital era, many are using cognitive computing ushered in by IBM Watson® technology. Cognitive applications can learn and react to customer preferences, and then use that information to support capabilities such as confidence-weighted outcomes with data transparency, systematic learning and natural language processing. To make the most of these next-generation applications, you need a next-generation database. It must handle a massive volume of data while delivering high performance to support real-time analytics. At the same time, it must provide data availability for demanding applications, scalability for growth and flexibility for responding to changes.
Tags : 
database, applications, data availability, cognitive applications
    
Group M_IBM Q1'18
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
Effectively using and managing information has become critical to driving growth in areas such as pursuing new business opportunities, attracting and retaining customers, and streamlining operations. In the era of big data, you must accommodate a rapidly increasing volume, variety and velocity of data while extracting actionable business insight from that data, faster than ever before. These needs create a daunting array of workload challenges and place tremendous demands on your underlying IT infrastructure and database systems. This e-book presents six reasons why you should consider a database change, including opinions from industry analysts and real-world customer experiences. Read on to learn more.
Tags : 
database, streamlining, it infrastructure, database systems
    
Group M_IBM Q1'18
Published By: Oracle     Published Date: Nov 28, 2017
Today’s leading-edge organizations differentiate themselves through analytics, furthering their competitive advantage by extracting value from all their data sources. Other companies are looking to become data-driven through the modernization of their data management deployments. These strategies do include challenges, such as the management of large, growing volumes of data. Today’s digital world is already creating data at an explosive rate, and the next wave is on the horizon, driven by the emergence of IoT data sources. The physical data warehouses of the past were great for collecting data from across the enterprise for analysis, but the storage and compute resources needed to support them are not able to keep pace with the explosive growth. In addition, the manual, cumbersome tasks of patching, updating, and upgrading pose risks to data due to human error. To reduce risks, costs, complexity, and time to value, many organizations are taking their data warehouses to the cloud. Whether hosted lo
Tags : 
    
Oracle
Published By: NetApp     Published Date: Nov 13, 2017
To help member hospitals and clinics easily, securely, and affordably protect expanding data volumes, Engage switched to NetApp® AltaVault backup.
Tags : 
    
NetApp
Published By: IBM     Published Date: Nov 08, 2017
Flexible deployment options and licensing models help take the challenges out of change. As you move toward the cloud, you're likely planning or managing a mixed environment of on-premises and on-cloud applications. To help you succeed in this transition, you need a transformative, mixed-workload database that can handle a massive volume of data while delivering high performance, data availability, and the flexibility to adapt and respond to business changes.
Tags : 
ibm, cloud, cloud computing, database, ibm db2
    
IBM
Published By: IBM     Published Date: Nov 08, 2017
Flexible deployment options and licensing models help take the challenges out of change. As you move toward the cloud, you're likely planning or managing a mixed environment of on-premises and on-cloud applications. To help you succeed in this transition, you need a transformative, mixed-workload database that can handle a massive volume of data while delivering high performance, data availability, and the flexibility to adapt and respond to business changes.
Tags : 
ibm db2, cloud, on-cloud applications, mixed-workload database
    
IBM
Published By: MarkLogic     Published Date: Nov 07, 2017
Today, data is big, fast, varied and constantly changing. As a result, organizations are managing hundreds of systems and petabytes of data. However, many organizations are unable to get the most value from their data because they’re using RDBMS to solve problems they weren’t designed to fix. Why change? In this white paper, we dive into the details of why relational databases are ill-suited to handle the massive volumes of disparate, varied, and changing data that organizations have in their data centers. It is for this reason that leading organizations are going beyond relational to embrace new kinds of databases. And when they do, the results can be dramatic.
Tags : 
    
MarkLogic
Published By: AWS     Published Date: Nov 02, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store massive amounts of data in a central location, so it’s readily available to be categorized, processed, analyzed, and consumed by diverse groups within an organization. Since data, both structured and unstructured, can be stored as-is, there’s no need to convert it to a predefined schema, and you no longer need to know what questions you want to ask of your data beforehand.
Tags : 
    
AWS
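The "store as-is, apply schema at query time" idea behind a data lake can be sketched in a few lines. This is a hedged illustration of schema-on-read, with invented record shapes and field names, not an AWS API example:

```python
# Schema-on-read sketch: heterogeneous records land in the lake as raw JSON,
# and a "schema" (which fields to project, which records to keep) is applied
# only when a question is asked.
import json

lake = [
    json.dumps({"type": "clickstream", "user": "u1", "page": "/home"}),
    json.dumps({"type": "sensor", "device": "d42", "temp_c": 21.5}),
    json.dumps({"type": "clickstream", "user": "u2", "page": "/cart"}),
]

# Query time: parse and project only the fields this analysis needs,
# ignoring records whose shape doesn't match the question.
records = [json.loads(r) for r in lake]
pages = [r["page"] for r in records if r["type"] == "clickstream"]
print(pages)  # → ['/home', '/cart']
```

Contrast this with a warehouse, where the sensor record would be rejected at load time unless a schema for it had been defined in advance.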