data volume

Published By: IBM     Published Date: Jul 19, 2016
Data movement and management are a major pain point for organizations operating HPC environments. Whether you are deploying a single cluster or managing a diverse research facility, you should be taking a data-centric approach. As data volumes grow and the cost of compute drops, managing data consumes more of the HPC budget and computational time. The need for data-centric HPC architectures grows dramatically as research teams pool their budgets to purchase shared resources and improve overall utilization. Learn more in this white paper about the key considerations when expanding from traditional compute-centric to data-centric HPC.
Tags : 
ibm, analytics, hpc, big data
    
IBM
Published By: Vision Solutions     Published Date: Jun 10, 2009
Journaling? RAID? Vaulting? Mirroring? High availability? Know your data protection and recovery options! Download this information-packed 29-page report that reviews the spectrum of IBM i (i5/OS) and AIX resilience and recovery technologies and best practices choices, including the latest, next-generation solutions.
Tags : 
aix, i5/os, vision, resilience, recovery, recovery solutions, os/400, ibm i, next generation solutions, recovery time, hacmp, high availability, clustering, environmental conditions, operator error, software bugs, data loss, 99.99, failure, logical volume manager
    
Vision Solutions
Published By: Vision Solutions     Published Date: Jun 10, 2009
For IT departments looking to bring their AIX environments up to the next step in data protection, IBM’s PowerHA (HACMP) connects multiple servers to shared storage via clustering. This offers automatic recovery of applications and system resources if a failure occurs with the primary server.
Tags : 
aix, i5/os, vision, resilience, recovery, recovery solutions, os/400, ibm i, next generation solutions, recovery time, hacmp, high availability, clustering, environmental conditions, operator error, software bugs, data loss, 99.99, failure, logical volume manager
    
Vision Solutions
Published By: SAS     Published Date: Jan 04, 2019
As the pace of business continues to accelerate, forward-looking organizations are beginning to realize that it is not enough to analyze their data; they must also take action on it. To do this, more businesses are beginning to systematically operationalize their analytics as part of a business process. Operationalizing and embedding analytics is about integrating actionable insights into the systems and business processes used to make decisions. These systems might be automated or provide manual, actionable insights. Analytics are currently being embedded into dashboards, applications, devices, systems, and databases. Examples run from simple to complex, and organizations are at different stages of operational deployment. Newer examples of operational analytics include support for logistics, customer call centers, fraud detection, and recommendation engines, to name just a few. Embedding analytics is certainly not new, but it has been gaining more attention recently as data volumes and the frequency of decisions increase.
Tags : 
    
SAS
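A hypothetical illustration of what "embedding analytics into a business process" can mean in practice: a scoring function called inline from an order-handling workflow, so the insight triggers an action rather than landing in an after-the-fact report. The rules, fields, and threshold below are invented for the sketch; a real deployment would call a trained model.

```python
# Hypothetical sketch: an analytic score embedded directly in a
# transaction-processing path (fraud detection as the example).

def fraud_score(order):
    """Toy scoring rule; stands in for a trained model."""
    score = 0.0
    if order["amount"] > 1000:          # unusually large order
        score += 0.5
    if order["ship_country"] != order["bill_country"]:
        score += 0.3
    if order["account_age_days"] < 7:   # brand-new account
        score += 0.3
    return min(score, 1.0)

def process_order(order, threshold=0.6):
    """The business process acts on the score inline."""
    return "review" if fraud_score(order) >= threshold else "approve"

risky = {"amount": 2500, "ship_country": "BR", "bill_country": "US",
         "account_age_days": 2}
safe = {"amount": 40, "ship_country": "US", "bill_country": "US",
        "account_age_days": 400}
print(process_order(risky))  # review
print(process_order(safe))   # approve
```

The point of the pattern is that the decision happens inside the operational system, not in a separate reporting step.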
Published By: IBM     Published Date: Jan 02, 2014
This study highlights the phases of the big data journey, the objectives and challenges of midsize organizations taking the journey, and the current state of the technology that they are using to drive results. It also offers a pragmatic course of action for midsize companies to take as they dive into this new era of computing.
Tags : 
ibm, analytics, global business service, big data, business value, it professionals, volume, velocity, variety, customer analytics, trends and insights, information management, data security, integration, variety of data, analytic accelerator, infrastructure
    
IBM
Published By: IBM     Published Date: Jan 09, 2014
Simply put, software defined storage is the abstraction of storage services from storage hardware. This term is more than just marketing hype; it's the logical evolution of storage virtualization from simply being a storage aggregator to the end goal of storage as a service. To achieve this goal, software defined storage needs a platform from which to centralize.
Tags : 
ibm, storage switzerland, storage platform, sds, software defined storage, data center, storage virtualization, data protection, shared storage, storage capacity, centric systems, high performance databases, storage efficiency, range of solutions, vendor options, volume management, software selection, data service, data management, performance loss
    
IBM
Published By: IBM     Published Date: Mar 05, 2014
For many years, companies have been building data warehouses to analyze business activity and produce insights for decision makers to act on to improve business performance. These traditional analytical systems are often based on a classic pattern where data from multiple operational systems is captured, cleaned, transformed and integrated before loading it into a data warehouse. Typically, a history of business activity is built up over a number of years allowing organizations to use business intelligence (BI) tools to analyze, compare and report on business performance over time. In addition, subsets of this data are often extracted from data warehouses into data marts that have been optimized for more detailed multi-dimensional analysis.
Tags : 
ibm, big data, data, big data platform, analytics, data sources, data complexity, data volume, data generation, data management, storage, acceleration, business intelligence, data warehouse
    
IBM
Published By: Altiscale     Published Date: Oct 19, 2015
In this age of Big Data, enterprises are creating and acquiring more data than ever before. To handle the volume, variety, and velocity requirements associated with Big Data, Apache Hadoop and its thriving ecosystem of engines and tools have created a platform for the next generation of data management, operating at a scale that traditional data warehouses cannot match.
Tags : 
big data, analytics, nexgen, hadoop, apache
    
Altiscale
Published By: IBM     Published Date: Jun 20, 2014
Data volumes are getting out of control, but choosing the right ILG solution can be a huge challenge, with multiple stakeholders, numerous business processes, and extensive solution requirements. Use this requirements kit from CGOC to find the tools and technology you need.
Tags : 
ibm, ilg, information lifecycle governance, information management, ilg solutions, data storage, data management
    
IBM
Published By: IBM     Published Date: Apr 29, 2015
First generation warehouses were not designed to manage data at today's volume or variety. Coercing older technologies to satisfy new demands can be inefficient, burdensome and costly. Read how IBM PureData System for Analytics is built for simplicity and speed.
Tags : 
big data, data management, hardware, business intelligence
    
IBM
Published By: IBM     Published Date: Jul 08, 2015
To meet the business imperative for enterprise integration and stay competitive, companies must manage the increasing variety, volume and velocity of new data pouring into their systems from an ever-expanding number of sources. They need to bring all their corporate data together, deliver it to end users as quickly as possible to maximize its value, and integrate it at a more granular level than ever before, focusing on the individual transaction level rather than on general summary data. As data volumes continue to explode, clients must take advantage of a fully scalable information integration architecture that supports any type of data integration technique, such as ETL, ELT (also known as ETL pushdown), data replication or data virtualization. Read this new whitepaper to learn about the seven essential elements needed to achieve the highest performance.
Tags : 
    
IBM
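To make the ETL-versus-ELT distinction above concrete, here is a minimal contrast using an in-memory SQLite database: ETL transforms in the application before loading, while ELT loads the raw data first and pushes the transform down to the engine as SQL. The table and column names are invented for the sketch.

```python
# ETL vs ELT ("ETL pushdown") in miniature, using SQLite as the engine.
import sqlite3

rows = [("widget", "20"), ("gadget", "5"), ("widget", "7")]

# --- ETL: transform in the application, then load only the result ---
etl_totals = {}
for product, amount in rows:                 # transform before loading
    etl_totals[product] = etl_totals.get(product, 0.0) + float(amount)

# --- ELT: load raw data first, push the transform down to the engine ---
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE raw_sales (product TEXT, amount TEXT)")
con.executemany("INSERT INTO raw_sales VALUES (?, ?)", rows)
elt_totals = dict(con.execute(
    "SELECT product, SUM(CAST(amount AS REAL)) "
    "FROM raw_sales GROUP BY product"))

print(elt_totals == etl_totals)  # True: same result, different placement
```

The results match; what differs is where the compute happens, which is exactly why pushdown matters as volumes grow.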
Published By: IBM     Published Date: Jul 08, 2015
Today data volumes are exploding in every facet of our lives. Business leaders are eager to harness the power of big data but before setting out into the big data world it is important to understand that as opportunities increase ensuring that source information is trustworthy and protected becomes exponentially more difficult. This paper provides a detailed review of the best practices clients should consider before embarking on their big data integration projects.
Tags : 
    
IBM
Published By: Arbor Networks     Published Date: Mar 06, 2013
This Frost & Sullivan white paper identifies what organizations need to know to protect their intellectual property and prepare for unexpected data breaches. Cyber threats continue to mutate and grow in volume. Read on to learn more.
Tags : 
defeating cyber threats, wider net, arbor networks, network security, improving ability to protect data, data security
    
Arbor Networks
Published By: Cisco     Published Date: Jan 15, 2015
While this dramatic growth occurs, projections call for the cloud to account for nearly two-thirds of data center traffic and for cloud-based workloads to quadruple over traditional servers. That adds another element to the picture: changing traffic patterns. Under a cloud model, a university, for example, can build its network to handle average traffic volumes but then offload data on heavier trafficked days to a public cloud service when demand dictates, such as when it's time to register for the next semester of classes.
Tags : 
cloud, growth, traffic, projection, account, network
    
Cisco
Published By: HP     Published Date: May 14, 2014
Your data is everywhere. It's in the cloud, in virtual environments, in remote offices, and on mobile devices. Plus, there's more of it, and much of it is business-critical, meaning you need to back up larger volumes of data than ever before without adding costs or bandwidth. Even more importantly, you need the ability to restore data quickly in the event of a disaster, failure, or outage. The cost of data protection is higher now than ever.
Tags : 
hewlett packard, cloud, mobile, remote, data recovery, disaster, failure
    
HP
Published By: Dell     Published Date: Sep 17, 2014
As the 3 Vs of big data (volume, velocity and variety) continue to grow, so too does the opportunity for finance-sector firms to capitalize on this data for strategic advantage. Read how.
Tags : 
dell, finance data, data storage, financial records, secure finance, cloud, finance cloud
    
Dell
Published By: IBM     Published Date: Dec 06, 2013
Partners and customers expect instantaneous response and continuous uptime from data systems, but the volume and velocity of data make it difficult to respond with agility. IBM PureData System for Transactions enables businesses to gear up and meet these challenges.
Tags : 
ibm, ibm puredata system, data, data mangement, database, integration, transactions, workload, availability, data system
    
IBM
Published By: Xiotech     Published Date: Apr 13, 2007
Organizations face a double whammy when it comes to data archive management. On one hand, the volume of digital and paper archives is growing exponentially. Industry analysts report that digital hard disk storage has grown 85 percent a year over the last eight years and 2.7 billion new sheets of paper are filed into folders every day.
Tags : 
data archive, data management, storage, data warehousing, archive management, document search, search and retrieval, document management, document archive, xiotech
    
Xiotech
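The 85 percent annual growth figure cited above compounds dramatically; a quick back-of-envelope check shows why eight years of it overwhelms a static archive strategy:

```python
# Compound growth check for the cited 85%-per-year figure:
# capacity multiplies by 1.85 each year.
growth = 1.85
years = 8
multiplier = growth ** years
print(f"After {years} years: about {multiplier:.0f}x the original volume")
```

Roughly a 137-fold increase over the eight-year period the analysts describe.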
Published By: IBM     Published Date: May 30, 2008
Not all data is created equal, however, and deriving business value from information dispersed over many different sources means integrating, analyzing and optimizing heterogeneous types and sources of information throughout its lifecycle. Find out how, when deployed together in the same environment, IBM TotalStorage Productivity Center, IBM System Storage SAN Volume Controller and IBM Tivoli Storage Manager deliver the core technologies needed to help customers tier their storage based on classes of service.
Tags : 
information lifecycle, information management, tivoli, storage management, leveraging information, ibm, ibm li, li campaign
    
IBM
Published By: IBM     Published Date: Dec 03, 2008
As your data volume and value increases, reduce your risk with the IBM System Storage DS3400 Express. This direct-attach storage (DAS) or SAN solution is scalable and expandable to consolidate your data. It’s also flexible and affordable for small and midsize businesses. Get all the details at a glance in this data sheet.
Tags : 
ibm, express seller, system storage, ds3400, san solution, ds3400 express
    
IBM
Published By: Neterion     Published Date: Dec 05, 2006
The relentless growth of data and network-intensive applications such as digital imaging, multimedia content, and broadcast/video continues to drive volumes of enterprise data and network traffic. As growth continues, IT managers are challenged with implementing solutions without interrupting critical business processes.
Tags : 
network infrastructure, traffic management, bandwidth management, bandwidth, network management, neterion
    
Neterion
Published By: Subrago     Published Date: Apr 30, 2009
The key objective of this white paper is to highlight the key issues and discuss processes and controls required to build a high performing IT support organization.
Tags : 
it support, subrago, it costs, customer transaction, high performing it support, it dependency, tolerance level, production services, change management, itil, itil v3, efficiency, it infrastructure, cab, change advisory board, segregation of duties, zero down, metrics, hardware performance, database performance
    
Subrago
Published By: Arcserve     Published Date: May 29, 2015
Today, data volumes are growing exponentially and organizations of every size are struggling to manage what has become a very expensive and complex problem. It causes real issues such as:
• Overprovisioning the backup infrastructure to anticipate rapid future growth.
• Legacy systems that can't cope, so backups take too long or are incomplete.
• Missed recovery point objectives and recovery time targets.
• Backups that overload infrastructure and network bandwidth.
• Reluctance to embrace new technologies, such as cloud backup, because there is too much data to transfer over wide area networks.
Tags : 
backup infrastructure, legacy systems, overload infrastructure, cloud
    
Arcserve
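A rough way to see why backups "take too long" as volumes grow: the backup window is simply data volume divided by effective throughput. The numbers below are illustrative, not from the paper.

```python
# Illustrative backup-window arithmetic: window = volume / throughput.

def backup_hours(volume_tb, throughput_mb_s):
    """Hours needed to move volume_tb at a sustained throughput_mb_s."""
    mb = volume_tb * 1024 * 1024          # TB -> MB (binary units)
    return mb / throughput_mb_s / 3600    # seconds -> hours

# 50 TB over a link sustaining 500 MB/s:
print(f"{backup_hours(50, 500):.1f} hours")   # ~29.1 hours
```

At that rate a full backup already exceeds a daily window, which is why growing volumes push organizations toward incremental, deduplicated, or cloud-offloaded approaches.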
Published By: SAS     Published Date: Aug 03, 2016
Data visualization is the visual and interactive exploration and graphic representation of data of any size, type (structured and unstructured) or origin. Visualizations help people see things that were not obvious to them before. Even when data volumes are very large, patterns can be spotted quickly and easily. Visualizations convey information in a universal manner and make it simple to share ideas with others.
Tags : 
best practices, data visualization, data, technology
    
SAS
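The claim above, that visual form makes patterns obvious which a raw list of numbers hides, can be shown even without a graphics library. This is a toy text histogram on invented sample data; real tools render the same idea graphically and interactively.

```python
# Minimal sketch of visualization as pattern-spotting: a text histogram
# makes the skew in this sample obvious at a glance.
from collections import Counter

values = [1, 2, 2, 3, 3, 3, 3, 4, 4, 9]   # invented sample data

def text_histogram(data):
    counts = Counter(data)
    return "\n".join(f"{v}: {'#' * counts[v]}" for v in sorted(counts))

print(text_histogram(values))
# 1: #
# 2: ##
# 3: ####
# 4: ##
# 9: #
```

The cluster around 3 and the lone outlier at 9 jump out of the bars in a way they do not from the raw list.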