data volume

Published By: TeamQuest     Published Date: Apr 09, 2014
TeamQuest Director of Global Services Per Bauer explains how to manage services in relation to servers, storage, network, power and floor space. Understand costing data, incidents, business transaction volumes, and demand forecasts. Watch this short video to learn more about Optimization in a Box and how to quickly improve your ability to optimize business services today.
Tags : 
teamquest, virtualization, it professionals, optimize software, distributed environment, data center, performance
    
TeamQuest
Published By: Commvault     Published Date: Jul 06, 2016
Think of a wildfire that quickly spreads as it increases in speed and power. That is what is happening today as data growth increases the volume and management complexity of storage, backup and recovery. Now think of trying to stop that fire with a garden hose. Your traditional backup and recovery process is equally under-equipped to manage and facilitate operations that need more speed, efficiency, scalability and reliability to handle today’s 24/7, always-on environment. Here we examine the benefits of moving from a solution composed of multiple point products to a holistic data protection platform designed to serve today’s enterprise.
Tags : 
commvault, data protection, storage, backup, recovery, holistic data protection, single pane of glass, common code base, analytics, reporting
    
Commvault
Published By: Teradata     Published Date: May 02, 2017
A Great Use of the Cloud: Recent trends in information management see companies shifting their focus to cloud-based solutions, or entertaining the notion for the first time. In the past, the only clear choice for most organizations has been on-premises data, oftentimes using an appliance-based platform. However, the costs of scale are gnawing away at the notion that this remains the best approach for all or some of a company’s analytical needs. This paper, written by McKnight Consulting analysts William McKnight and Jake Dolezal, describes two organizations with mature enterprise data warehouse capabilities that have pivoted components of their architecture to accommodate the cloud.
Tags : 
data projects, data volume, business data, cloud security, data storage, data management, cloud privacy, encryption, security integration
    
Teradata
Published By: Resource     Published Date: Dec 04, 2018
What’s a common characteristic of the best talent? They all have jobs. In today’s marketplace, to get the best talent you have to convince them your opportunity is better than what they currently have. The good news: it can be done, and you can win consistently with a deliberate outbound process. The bad news: it requires an intentional approach, which is challenging without the right tools and data. Your success depends on your ability to build a repeatable process to identify and recruit a steady volume of high-quality applicants. The way to accelerate and scale your outbound process is to benchmark and refine it regularly using data. Below we walk through the steps to building a data-driven recruiting process, based on, you guessed it, data.
Tags : 
    
Resource
Published By: McAfee     Published Date: Nov 07, 2014
According to the report “Needle in a Datastack,” companies are vulnerable to security breaches because they are unable to adequately analyze or store Big Data. These ever-growing volumes of events, as well as data on assets, threats, users, and other relevant information, have created a major Big Data challenge for security teams. To solve this challenge, companies have abandoned traditional data management architectures and adopted systems dedicated to managing security data in the era of advanced persistent threats (APTs).
Tags : 
siem, big security data, big data security, security information, advanced threats, advanced persistent threats, apt, security intelligence, security
    
McAfee
Published By: Nice Systems     Published Date: Feb 26, 2019
NICE has made a significant investment into AI and ML techniques that are embedded into its core workforce management solution, NICE WFM. Recent advancements include learning models that find hidden patterns in the historical data used to generate forecasts for volume and work time. NICE WFM also has an AI tool that determines, from a series of more than 40 models, which single model will produce the best results for each work type being forecasted. NICE has also included machine learning in its scheduling processes which are discussed at length in the white paper.
Tags : 
    
Nice Systems
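The per-work-type model selection the NICE abstract above describes, scoring many candidate forecast models against history and keeping the best performer, can be illustrated with a minimal sketch. The candidate models and the error metric below are illustrative assumptions for the general technique, not NICE WFM internals:

```python
# Hypothetical sketch: pick the forecasting model with the lowest
# holdout error for one work type's volume history.
from statistics import mean

def mape(actual, predicted):
    """Mean absolute percentage error over a holdout window."""
    return mean(abs(a - p) / a for a, p in zip(actual, predicted) if a != 0)

# Toy candidate "models": each maps a training series to a constant forecast.
# A real WFM suite would use far richer models; these stand in for the idea.
CANDIDATES = {
    "naive_last": lambda hist: hist[-1],
    "mean_all": lambda hist: mean(hist),
    "mean_last_4": lambda hist: mean(hist[-4:]),
}

def select_model(history, holdout_len=4):
    """Fit on the earlier window, score each candidate on the holdout."""
    train, holdout = history[:-holdout_len], history[-holdout_len:]
    scores = {}
    for name, model in CANDIDATES.items():
        forecast = [model(train)] * holdout_len
        scores[name] = mape(holdout, forecast)
    return min(scores, key=scores.get), scores

# Weekly contact volumes for one work type (made-up numbers).
volumes = [410, 432, 405, 398, 441, 420, 415, 428]
best, scores = select_model(volumes)
print(best, scores)
```

The same selection loop would run once per work type, so each queue can end up with a different winning model.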
Published By: Dell EMC     Published Date: Nov 03, 2016
IT managers are struggling to keep up with the “always available” demands of the business. Data growth and the nearly ubiquitous adoption of server virtualization among mid-market and enterprise organizations are increasing the cost and complexity of meeting storage and data availability needs. This report documents ESG Lab testing of Dell EMC Storage SC Series with a focus on the value of enhanced Live Volume support that provides always-available access with great ease of use and economics.
Tags : 
storage, data, sql, architecture
    
Dell EMC
Published By: Red Hat, Inc.     Published Date: Jul 12, 2012
Recently, enterprises have seen enormous gains in scalability, flexibility, and affordability as they migrated from proprietary, monolithic server architectures to architectures that are virtualized, open source, standardized, and commoditized.
Tags : 
scalability, flexibility, affordability, performance, ease-of-use, reduced acquisition costs, reduced maintenance costs, red hat storage server, cloud, cloud computing, public cloud, private cloud, open cloud, data center, amazon web services, aws, unified file, unified object, n-way local synchronous replication, elastic volume management
    
Red Hat, Inc.
Published By: IBM     Published Date: Jan 02, 2014
This study highlights the phases of the big data journey, the objectives and challenges of midsize organizations taking the journey, and the current state of the technology that they are using to drive results. It also offers a pragmatic course of action for midsize companies to take as they dive into this new era of computing.
Tags : 
ibm, analytics, global business service, big data, business value, it professionals, volume, velocity, variety, customer analytics, trends and insights, information management, data security, integration, variety of data, analytic accelerator, infrastructure
    
IBM
Published By: IBM     Published Date: Mar 05, 2014
For many years, companies have been building data warehouses to analyze business activity and produce insights for decision makers to act on to improve business performance. These traditional analytical systems are often based on a classic pattern where data from multiple operational systems is captured, cleaned, transformed and integrated before loading it into a data warehouse. Typically, a history of business activity is built up over a number of years allowing organizations to use business intelligence (BI) tools to analyze, compare and report on business performance over time. In addition, subsets of this data are often extracted from data warehouses into data marts that have been optimized for more detailed multi-dimensional analysis.
Tags : 
ibm, big data, data, big data platform, analytics, data sources, data complexity, data volume, data generation, data management, storage, acceleration, business intelligence, data warehouse
    
IBM
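The classic capture, clean, transform, and load pattern the IBM abstract above describes can be sketched with an in-memory SQLite database standing in for the warehouse. The source systems, field names, and table schema here are illustrative assumptions, not anything from the paper:

```python
# Toy extract-transform-load: pull rows from two "operational systems",
# standardize them, then load them into a warehouse table for analysis.
import sqlite3

# Extract: rows as they might arrive from two differently shaped sources.
crm_rows = [{"cust": " Acme Corp ", "rev": "1200.50"}]
erp_rows = [{"customer_name": "acme corp", "revenue": 300.0}]

def clean(name, revenue):
    """Standardize customer names and coerce revenue to a float."""
    return name.strip().title(), float(revenue)

# Transform: map both source schemas onto one integrated target schema.
records = [clean(r["cust"], r["rev"]) for r in crm_rows]
records += [clean(r["customer_name"], r["revenue"]) for r in erp_rows]

# Load: write the integrated history into the warehouse.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE fact_revenue (customer TEXT, revenue REAL)")
db.executemany("INSERT INTO fact_revenue VALUES (?, ?)", records)

# A data-mart-style aggregate query over the loaded history.
for row in db.execute(
    "SELECT customer, SUM(revenue) FROM fact_revenue GROUP BY customer"
):
    print(row)  # ('Acme Corp', 1500.5)
```

The cleaning step is where the "captured, cleaned, transformed and integrated" work happens; only after it does the data become comparable across systems and over time.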
Published By: Xiotech     Published Date: Apr 13, 2007
Organizations face a double whammy when it comes to data archive management. On one hand, the volume of digital and paper archives is growing exponentially. Industry analysts report that digital hard disk storage has grown 85 percent a year over the last eight years, and that 2.7 billion new sheets of paper are filed into folders every day.
Tags : 
data archive, data management, storage, data warehousing, archive management, document search, search and retrieval, document management, document archive, xiotech
    
Xiotech
Published By: CA Technologies_Business_Automation     Published Date: Jun 29, 2018
Challenge: It is not uncommon for SAP system copies, including any post-editing, to take several days to complete. Meanwhile, testing, development and training activities come to a standstill, and the large number of manual tasks in the entire process ties up highly skilled SAP BASIS staff.

Opportunity: Enterprises are looking to automation as a way to accelerate SAP system copies and free up staff. However, this is only one part of the problem: what further complicates the system copy process is the need to safeguard sensitive data and manage huge data volumes, while also ensuring that the data used in non-production systems adequately reflects the data in production systems so the quality of development, testing and training activities is not compromised.

Benefits: This white paper explains how a considerable portion of the SAP system copy process can be automated using the CA Automic Automated System Copy for SAP solution and SNP T-Bone, helping enterprises become more agile.
Tags : 
    
CA Technologies_Business_Automation
Published By: AWS - ROI DNA     Published Date: Aug 09, 2018
In today's big data world, your organization produces large volumes of data at great velocity. Generating value from this data and guiding decision making require quick capture, analysis, and action. Without strategies to turn data into insights, the data loses its value and insights become irrelevant. Real-time data integration and analytics tools play a crucial role in harnessing your data so you can enable business and IT stakeholders to make evidence-based decisions.
Tags : 
    
AWS - ROI DNA
Published By: HP - Enterprise     Published Date: Jun 04, 2013
Businesses are overwhelmed with data; it’s a blessing and a curse. A curse because it can overwhelm traditional approaches to storing and processing it. A blessing because the data promises business insight that never existed before. The industry has spawned a new term, “big data,” to describe it. Now, IT itself is overwhelmed with its own big data. In the rush to roll out new services and technologies (mobility, cloud, virtualization), applications, networks, and physical and virtual servers grow in a sprawl. With them comes an unprecedented volume of data such as logs, events, and flows. It takes too much time and too many resources to sift through it, so most of it lies unexplored and unexploited. Yet like business data, it contains insight that can help us solve problems, make decisions, and plan for the future.
Tags : 
data research, big data, virtualization, applications, networks
    
HP - Enterprise
Published By: MarkLogic     Published Date: Jun 09, 2017
Today, data is big, fast, varied and constantly changing. As a result, organizations are managing hundreds of systems and petabytes of data. However, many organizations are unable to get the most value from their data because they’re using relational databases to solve problems they weren’t designed to fix. Why change? In this white paper, we dive into the details of why relational databases are ill-suited to handle the massive volumes of disparate, varied, and changing data that organizations have in their data centers. It is for this reason that leading organizations are going beyond relational to embrace new kinds of databases. And when they do, the results can be dramatic.
Tags : 
    
MarkLogic
Published By: MarkLogic     Published Date: Nov 07, 2017
Today, data is big, fast, varied and constantly changing. As a result, organizations are managing hundreds of systems and petabytes of data. However, many organizations are unable to get the most value from their data because they’re using relational databases to solve problems they weren’t designed to fix. Why change? In this white paper, we dive into the details of why relational databases are ill-suited to handle the massive volumes of disparate, varied, and changing data that organizations have in their data centers. It is for this reason that leading organizations are going beyond relational to embrace new kinds of databases. And when they do, the results can be dramatic.
Tags : 
    
MarkLogic
Published By: MarkLogic     Published Date: Mar 13, 2015
Big Data has been in the spotlight recently, as businesses seek to leverage their untapped information resources and win big on the promise of big data. However, the problem with big data initiatives is that organizations try using existing information management practices and legacy relational database technologies, which often collapse under the sheer weight of the data. In this paper, MarkLogic explains how a new approach is needed to handle the volume, velocity, and variety of big data, because the current relational model that has been the status quo is not working. Learn about the NoSQL paradigm shift, and why NoSQL is gaining significant market traction because it solves the fundamental challenges of big data, achieving better performance, scalability, and flexibility. Learn how MarkLogic’s customers are reimagining their data to:
- Make the world more secure
- Provide access to valuable information
- Create new revenue streams
- Gain insights to increase market share
- Reduce b
Tags : 
enterprise, nosql, relational, databases, data storage, management system, application, scalable
    
MarkLogic
Published By: Hewlett Packard Enterprise     Published Date: Oct 24, 2017
Big data is not just a big buzzword. Government agencies have been collecting large amounts of data for some time and analyzing it to one degree or another. Big data is a term that describes the high volume, variety and velocity of information that inundates an organization on a regular basis. But it’s not the amount of data that’s important. It’s what organizations do with the data that matters. Big data can be analyzed for insights that lead to better decisions and better services.
Tags : 
cloud optimization, cloud efficiency, cloud management, cloud assurance, cloud visibility, enterprise management, data management
    
Hewlett Packard Enterprise
Published By: IBM     Published Date: Aug 05, 2014
There is a lot of discussion in the press about Big Data. Big Data is traditionally defined in terms of the three V’s of Volume, Velocity, and Variety. In other words, Big Data is often characterized as high-volume, streaming, and including semi-structured and unstructured formats. Healthcare organizations have produced enormous volumes of unstructured data, such as the notes by physicians and nurses in electronic medical records (EMRs). In addition, healthcare organizations produce streaming data, such as from patient monitoring devices. Now, thanks to emerging technologies such as Hadoop and streams, healthcare organizations are in a position to harness this Big Data to reduce costs and improve patient outcomes. However, this Big Data has profound implications from an Information Governance perspective. In this white paper, we discuss Big Data Governance from the standpoint of three case studies.
Tags : 
ibm, data, big data, information, healthcare, governance, technology
    
IBM
Published By: TIBCO     Published Date: May 15, 2013
According to Forrester, most organizations today are only using 12% of their available data, and only 37% of organizations are planning some type of big data technology project. At a time when companies are seeing the volume of information increase quickly, it’s time to take a step back and look at the impact of big data. Join Mike Gualtieri, Principal Analyst at Forrester, for this webcast exploring the importance of integration in your big data initiatives. Discover how your ability to operate, make decisions, reduce risks and serve customers is inextricably linked to how well you handle your big data. Continue on to gain insight into:
• 3 key big data management activities you need to consider
• Technologies you need to create for your big data ecosystem
• Why a multi-dimensional view of the customer is the holy grail of individualization
• Overcoming key integration challenges
And more!
Tags : 
big data, integration, architecture, database, data warehousing, operations management
    
TIBCO
Published By: Datastax     Published Date: Aug 23, 2017
About 10 years ago, big data was quickly becoming the next big thing. It surged in popularity, sweeping into the tech world’s collective consciousness and spawning endless start-ups, thought pieces, and investment funding, and big data’s rise in the startup world does not seem to be slowing down. But something’s been happening lately: big data projects have been failing, or have been sitting on a shelf somewhere and not delivering on their promises. Why? To answer this question, we need to look at big data’s defining characteristic (or make that characteristics, plural), commonly known as “the 3Vs”: volume, variety, and velocity.
Tags : 
datastax, big data, funding
    
Datastax
Published By: HERE Technologies     Published Date: Jan 22, 2019
To improve safety and mobility across its 5,600 km road network, the City of Toronto forged a partnership with HERE Technologies for the provision of traffic, incident, and historical traffic data. Access to this data allows the city authority to see exactly what’s happening on its roads and to more easily and effectively run studies on improvement projects. This case study details how HERE Technologies enabled the City of Toronto’s transportation team to:
- Work smarter, with comprehensive network coverage and accurate data to aid analysis
- Examine the impact of city projects without significant forward planning or expenditure
- Ensure travel volume models used to drive decision making are calibrated to represent real-world truths
Tags : 
public sector, urbanization, traffic management
    
HERE Technologies
Published By: AWS     Published Date: Oct 26, 2018
Today’s organisations are tasked with analysing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organisations are finding that in order to deliver analytic insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store enormous amounts of data in a central location, so it’s readily available to be categorised, processed, analysed, and consumed by diverse groups within an organisation. Since data, structured and unstructured, can be stored as-is, there’s no need to convert it to a predefined schema, and you no longer need to know what questions you want to ask of your data beforehand.
Tags : 
data lake, amazon web services, aws
    
AWS
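The schema-on-read idea running through the data lake abstracts above and below (land data as-is in one central store, apply structure only when reading) can be sketched with a local directory standing in for an object store such as Amazon S3. The directory layout, record shapes, and field names here are illustrative assumptions:

```python
# Schema-on-read sketch: store raw, heterogeneous records exactly as
# they arrive, and impose structure only at query time.
import json
import pathlib
import tempfile

lake = pathlib.Path(tempfile.mkdtemp()) / "raw" / "events"
lake.mkdir(parents=True)

# Ingest: no upfront schema, even though the two sources disagree
# on field names and one record is missing a timestamp.
raw = [
    {"user": "u1", "action": "click", "ts": "2018-10-26T09:00:00"},
    {"user_id": "u2", "event_type": "view"},  # different shape, still stored
]
for i, rec in enumerate(raw):
    (lake / f"part-{i}.json").write_text(json.dumps(rec))

# Read: the schema is applied here, when a consumer asks a question.
def events_by_user(user):
    for path in lake.glob("*.json"):
        rec = json.loads(path.read_text())
        uid = rec.get("user") or rec.get("user_id")  # reconcile on read
        if uid == user:
            yield rec

print(list(events_by_user("u1")))
```

The trade-off is exactly the one these abstracts describe: ingestion never blocks on a predefined schema, so the reconciliation work moves to the readers who know what question they are asking.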
Published By: AWS     Published Date: Nov 02, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store massive amounts of data in a central location, so it’s readily available to be categorized, processed, analyzed, and consumed by diverse groups within an organization. Since data, structured and unstructured, can be stored as-is, there’s no need to convert it to a predefined schema, and you no longer need to know what questions you want to ask of your data beforehand.
Tags : 
    
AWS
Published By: Amazon Web Services     Published Date: Oct 09, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. Data lakes allow an organization to store all of its data, structured and unstructured, in one centralized repository.
Tags : 
cost effective, data storage, data collection, security, compliance, platform, big data, it resources
    
Amazon Web Services