Data Consolidation

Published By: ServiceNow     Published Date: May 14, 2019
In this report, Gartner provides recommendations to address the key challenges I&O leaders face, such as the inability to identify and remediate issues quickly, a lack of insights to connect increasing volumes of data, and the need for executive-level sponsorship of tool replacement and consolidation.
Published By: BMC ASEAN     Published Date: Dec 18, 2018
Big data projects often entail moving data between multiple cloud and legacy on-premises environments. A typical scenario involves moving data from a cloud-based source to a cloud-based normalization application, to an on-premises system for consolidation with other data, and then through various cloud and on-premises applications that analyze the data. Processing and analysis turn the disparate data into business insights delivered through dashboards, reports, and data warehouses - often using cloud-based apps. The workflows that take data from ingestion to delivery are highly complex and have numerous dependencies along the way. Speed, reliability, and scalability are crucial. So, although data scientists and engineers may do things manually during proof of concept, manual processes don't scale.
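The ingestion-to-delivery flow described above is, at its core, a dependency graph of tasks. As a minimal, vendor-neutral sketch (the task names are invented, and real steps would call cloud and on-premises services rather than print), a scheduler can resolve such dependencies with a topological sort:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Placeholder actions; in practice each step would invoke an external service.
steps = {
    "ingest":      lambda: print("pull raw data from the cloud source"),
    "normalize":   lambda: print("normalize in the cloud application"),
    "consolidate": lambda: print("merge with on-premises data"),
    "analyze":     lambda: print("run analytics jobs"),
    "publish":     lambda: print("deliver dashboards, reports, warehouse loads"),
}

# Each task maps to the set of tasks that must complete before it runs.
dependencies = {
    "normalize":   {"ingest"},
    "consolidate": {"normalize"},
    "analyze":     {"consolidate"},
    "publish":     {"analyze"},
}

# static_order() yields the tasks in an order that respects every dependency.
for task in TopologicalSorter(dependencies).static_order():
    steps[task]()
```

A production scheduler layers retries, monitoring, and cross-environment credentials on top of exactly this ordering problem, which is why manual processes stop scaling.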
Published By: Group M_IBM Q418     Published Date: Sep 10, 2018
LinuxONE from IBM is an example of a secure data-serving infrastructure platform that is designed to meet the requirements of current-gen as well as next-gen apps. IBM LinuxONE is ideal for firms that want the following:
• Extreme security: Firms that put data privacy and regulatory concerns at the top of their requirements list will find that LinuxONE comes with built-in, best-in-class security features such as EAL5+ isolation, crypto key protection, and a Secure Service Container framework.
• Uncompromised data-serving capabilities: LinuxONE is designed for structured and unstructured data consolidation and optimized for running modern relational and nonrelational databases. Firms can gain deep and timely insights from a "single source of truth."
• Unique balanced system architecture: The nondegrading performance and scaling capabilities of LinuxONE — thanks to a unique shared memory and vertical scale architecture — make it suitable for workloads such as databases and systems of record.
Published By: Dell EMC & Intel     Published Date: Sep 06, 2018
Data center improvements have thus far focused on cost reduction and point solutions. Server consolidation, cloud computing, virtualization, and the implementation of flash storage capabilities have all helped reduce server sprawl, along with associated staffing and facilities costs. Converged systems — which combine compute, storage, and networking into a single system — are particularly effective in enabling organizations to reduce operational and staff expenses. These software-defined systems require only limited human intervention. Code embedded in the software configures hardware and automates many previously manual processes, thereby dramatically reducing instances of human error. Concurrently, these technologies have enabled businesses to make incremental improvements to customer engagement and service delivery processes and strategies.
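A toy sketch of that software-defined idea, assuming nothing about any vendor's actual implementation: desired state is plain data, and a reconciliation loop drives hardware settings toward it (the node and setting names below are invented):

```python
# Desired vs. actual state for a hypothetical converged node. In a real system
# the "apply" step would call firmware or management APIs, not print().
desired = {"node1": {"cpu_profile": "performance", "raid_level": "raid10"}}
actual  = {"node1": {"cpu_profile": "balanced",    "raid_level": "raid10"}}

def reconcile(desired: dict, actual: dict) -> None:
    """Bring each node's actual settings in line with the desired settings."""
    for node, wanted in desired.items():
        current = actual.setdefault(node, {})
        for setting, value in wanted.items():
            if current.get(setting) != value:
                print(f"{node}: {setting} -> {value}")  # stand-in for a hardware call
                current[setting] = value

reconcile(desired, actual)
assert actual == desired  # converged: no further manual intervention needed
```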
Published By: Dell EMC & Intel     Published Date: Sep 06, 2018
To date, data center improvements have been limited to cost reduction and point solutions. Server consolidation, cloud computing, virtualization, and the implementation of flash storage have helped reduce server sprawl, along with the associated staffing and facilities costs. By combining compute, storage, and network resources in a single solution, converged systems prove particularly effective at lowering staffing and operating expenses. These software-defined systems require little human intervention. Code embedded in the software configures hardware and automates many formerly manual processes, considerably reducing the risk of human error. Together, these technologies have enabled businesses to progressively improve their customer engagement and service delivery processes and strategies.
Published By: Pure Storage     Published Date: Sep 04, 2018
Veritas' NetBackup software has long been a favorite for data protection in the enterprise, and it is now fully integrated with the market-leading all-flash data storage platform: Pure Storage. NetBackup leverages the FlashArray API for fast and simple snapshot management, and protection copies can be stored on FlashBlade for rapid restores and consolidation of file and object storage tiers. This webinar features architecture overviews as well as two live demos of the aforementioned integration points.
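As a rough sketch of what snapshot automation against an array API looks like, here is a generic REST call. Note that the endpoint, payload, and authentication below are hypothetical placeholders, not the actual FlashArray API:

```python
import requests

ARRAY_URL = "https://flasharray.example.com"  # hypothetical management address
API_TOKEN = "replace-with-issued-token"       # assumes token-based auth

def snapshot_volume(volume: str, suffix: str) -> dict:
    """Request a point-in-time snapshot of a volume (illustrative endpoint)."""
    resp = requests.post(
        f"{ARRAY_URL}/api/volumes/{volume}/snapshots",  # placeholder path
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"suffix": suffix},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # e.g., quiesce the database, snapshot, then hand off to backup policies
    print(snapshot_volume("oracle-data", "pre-backup"))
```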
Published By: SAS     Published Date: Aug 28, 2018
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics, and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies, and real-world applications. The report's survey quantifies user trends and readiness for data lakes.
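As a brief illustration of the schema-on-read pattern the report describes, here is a hedged PySpark sketch; the landing path and field names are invented:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lake-exploration").getOrCreate()

# Ingest raw, detailed source data as-is: the schema is inferred on read
# rather than modeled up front (the path is an illustrative placeholder).
raw = spark.read.json("s3a://lake/landing/events/")

# On-the-fly processing for exploration and discovery-oriented analytics.
(raw.filter(F.col("event_type") == "purchase")
    .groupBy("region")
    .count()
    .show())
```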
Published By: BlackLine     Published Date: Aug 06, 2018
The biotechnology and pharmaceutical industry is among the most heavily regulated industries in the world, challenged by evolving regulations, complex compliance requirements and close regulatory scrutiny. At the same time, companies must address the market pressures of globalization, the use of predictive data analytics and digital technologies, and the industry’s ongoing consolidation. In this challenging environment, confidence in internal controls is crucial.
Published By: Hewlett Packard Enterprise     Published Date: Jul 18, 2018
How hyperconverged infrastructure can reduce costs and help align enterprise IT with business needs. Includes chapters on hyperconvergence and cloud, datacenter consolidation, ROBO deployment, test and development environments, and disaster recovery.
Published By: IBM     Published Date: Jun 29, 2018
LinuxONE from IBM is an example of a secure data-serving infrastructure platform that is designed to meet the requirements of current-gen as well as next-gen apps. IBM LinuxONE is ideal for firms that want the following:
• Extreme security: Firms that put data privacy and regulatory concerns at the top of their requirements list will find that LinuxONE comes with built-in, best-in-class security features such as EAL5+ isolation, crypto key protection, and a Secure Service Container framework.
• Uncompromised data-serving capabilities: LinuxONE is designed for structured and unstructured data consolidation and optimized for running modern relational and nonrelational databases. Firms can gain deep and timely insights from a "single source of truth."
• Unique balanced system architecture: The nondegrading performance and scaling capabilities of LinuxONE — thanks to a unique shared memory and vertical scale architecture — make it suitable for workloads such as databases and systems of record.
Published By: Oracle     Published Date: Mar 22, 2018
Is your information technology (IT) organization pressured to get more work done with fewer people or on a constricted budget? Do you need to make IT a competitive asset rather than a cost center? Does your business struggle with slow software applications or data that's too often unavailable? If you answered "yes" to any of these questions, it's time to take a close look at Oracle Exadata, the world's fastest database machine, exclusively designed to run Oracle Database. It is the first database machine optimized for data warehousing, online transaction processing (OLTP), and database consolidation workloads, as well as in-memory databases and database as a service (DBaaS).
Published By: SAS     Published Date: Mar 06, 2018
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics, and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. With the right end-user tools, a data lake can enable the self-service data practices that both technical and business users need. These practices wring business value from big data, other new data sources, and burgeoning enterprise data.
Published By: SAS     Published Date: Oct 18, 2017
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics, and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies, and real-world applications. The report's survey quantifies user trends and readiness for data lakes.
Published By: Schneider Electric     Published Date: Aug 15, 2017
The reference guide lays out for data center managers a step-by-step approach to data center consolidation. By breaking down the process into clear and identifiable actions – all of which are covered in the sections below – a data center consolidation becomes much more manageable, and the odds of its success much higher.
Tags : 
assessment, strategic planning, design, commission, operation, additional resources, schneider electric
Published By: IBM     Published Date: Jul 26, 2017
To compete in today's fast-paced business climate, enterprises need accurate and frequent sales and customer reports to make real-time operational decisions about pricing, merchandising, and inventory management. They also require greater agility to respond to business events as they happen, and more visibility into business activities so information and systems are optimized for peak efficiency and performance. By making use of data capture and business intelligence to integrate and apply data across the enterprise, organizations can capitalize on emerging opportunities and build a competitive advantage. The IBM® data replication portfolio is designed to address these issues as a highly flexible one-stop shop for high-volume, robust, secure information replication across heterogeneous data stores. The portfolio leverages real-time data replication to support high availability, database migration, application consolidation, dynamic warehousing, master data management (MDM), and more.
Tags : 
ibm, infosphere, data replication, security, data storage
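A minimal, vendor-neutral sketch of the change-capture idea behind real-time replication, using SQLite stand-ins for heterogeneous stores (table and column names are invented; real log-based products read the database's transaction log rather than a trigger-style change table):

```python
import sqlite3

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")

# A change log on the source stands in for the database transaction log.
source.execute("""CREATE TABLE change_log (
    seq INTEGER PRIMARY KEY AUTOINCREMENT, id INTEGER, total REAL)""")

# Simulate a captured change on the source system.
source.execute("INSERT INTO orders VALUES (1, 99.5)")
source.execute("INSERT INTO change_log (id, total) VALUES (1, 99.5)")

def replicate(last_seq: int) -> int:
    """Apply all source changes after last_seq to the target; return the new high-water mark."""
    for seq, id_, total in source.execute(
        "SELECT seq, id, total FROM change_log WHERE seq > ?", (last_seq,)
    ):
        target.execute(
            "INSERT OR REPLACE INTO orders (id, total) VALUES (?, ?)", (id_, total)
        )
        last_seq = seq
    target.commit()
    return last_seq

high_water = replicate(0)
print(target.execute("SELECT * FROM orders").fetchall())  # -> [(1, 99.5)]
```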
Published By: Oracle PaaS/IaaS/Hardware     Published Date: Jul 25, 2017
A hardware and software system specifically designed for database software, optimizing database operations for both performance and administrative simplicity. The environment supports workload consolidation, thereby reducing the number of physical servers required for the databases in question. The benefits: reduced cost and optimal performance.
Published By: Dell EMC     Published Date: Feb 23, 2017
Want to join your peers in database storage nirvana? Learn how many organizations have benefited from the Dell EMC All-Flash portfolio, powered by Intel® Xeon® processors—and how it can help you solve the most common database storage challenges: time to market, consolidation, and complexity.
Published By: Fujitsu     Published Date: Feb 06, 2017
Data center infrastructure complexity must be tamed, as mobility, cloud networking and social media demand fast and agile approaches to data delivery. You can overcome these obstacles and improve your data center operations by consolidating your systems and deploying virtualization, using the Fujitsu PRIMEFLEX vShape reference architecture. Get the e-Book.
Tags : 
it infrastructure, data consolidation, data virtualization, converged infrastructure, primeflex vshape
Published By: Ciena     Published Date: Nov 15, 2016
"In healthcare, as the trends supporting eHealth accelerate, the need for scalable, reliable, and secure network infrastructures will only grow. This white paper describes the key factors and technologies to consider when building a private network for healthcare sector enterprises, including: Transport Network Equipment Outside Fiber Plant Converged Platforms Reliability, Redundancy, and Protection Reconfigurable Networks Management Software Security Services, Operation, Program Management, and Maintenance Download our white paper to learn more."
Tags : 
packet networking, packet networking portfolio, packet optical hie network, packet-optical transport, patient data security, private and hybrid cloud, private optical network for healthcare, private optical network for hospitals, rapidly growing bandwidth, real-time telesurgery video and data, remote imaging and diagnosis, simplify healthcare network operations, telehealth and telemedicine, telesurgery and video diagnosis, transfer massive imaging files, transmitting full-body mri images, transmitting genome data, transport network equipment, value-based care models, wan and lan convergence reduces
Published By: Dell EMC     Published Date: Nov 08, 2016
Time-to-market, consolidation, and complexity struggles are a thing of the past. Join your peers in database storage nirvana with the Dell EMC All-Flash portfolio, powered by Intel® Xeon® processors.
Tags : 
database, consolidation, capacity, storage, complexity
Published By: Commvault     Published Date: Jul 06, 2016
It’s no secret that today’s unprecedented data growth, data center consolidation and server virtualization are wreaking havoc with conventional approaches to backup and recovery. Here are five strategies for modern data protection that will not only help solve your current data management challenges but also ensure that you’re poised to meet future demands.
Tags : 
commvault, data growth, datacenter consolidation, server virtualization, backup, recovery, data management, scalability
Published By: CyrusOne     Published Date: Jul 05, 2016
Data centers help state and federal agencies reduce costs and improve operations. Every day, government agencies struggle to meet critical cost controls and lower operational expenses while fulfilling the goal of the Federal Data Center Consolidation Initiative (FDCCI). All too often, they find themselves constrained by legacy in-house data centers and connectivity solutions that fail to deliver exceptional data center reliability and uptime.
Tags : 
data center, best practices, competitive advantage, productivity