data volume

Published By: IBM     Published Date: Aug 08, 2014
Big data and analytics help insurance companies identify the next best action for customers. With the right solutions, companies can extract, integrate and analyze a large volume and variety of data, from call-center notes and voice recordings to web chats, telematics and social media.
Tags : 
big data, analytics, insurance, customer service, solutions
    
IBM
Published By: AWS - ROI DNA     Published Date: Aug 09, 2018
In today's data-driven world, your organization produces large volumes of data at great velocity. Generating value from this data and guiding decision making require quick capture, analysis and action. Without strategies to turn data into insights, the data loses its value and insights become irrelevant. Real-time data integration and analytics tools play a crucial role in harnessing your data so you can enable business and IT stakeholders to make evidence-based decisions.
Tags : 
    
AWS - ROI DNA
Published By: Vertica     Published Date: Mar 15, 2010
In a world of growing data volumes and shrinking IT budgets, it is critical to think differently about the efficiency of your database and storage infrastructure. The Vertica Analytic Database is a high-performance, scalable and cost-effective solution that can bring dramatic savings in hardware, storage and operational costs.
Tags : 
vertica, ec2, cdr, elastic, saas, cloud computing, data management, ad-hoc, business intelligence, cloud, cloud-based applications, analytic, dbms, mpp
    
Vertica
Published By: IBM     Published Date: Sep 10, 2009
With ever-increasing data growth worldwide, organizations must find smarter ways to store and manage their massive volumes of data. Learn how IBM® Tivoli® Storage Management software solutions help to maximize your current storage environment and reduce operational and capital costs while improving service and managing risks.
Tags : 
storage management, ibm, ibmtivoli, fastback, san, data growth, recovery management, reducing costs, improving service, managing risks, dynamic storage, infrastructure, virtualization, resource management
    
IBM
Published By: Dell Storage     Published Date: Jan 16, 2009
With the advent of iSCSI as the standard for networked storage, businesses can leverage existing skills and network infrastructure to create Ethernet-based SANs that deliver the performance of Fibre Channel—but at a fraction of the cost. iSCSI enables block-level data to be transported between a server and a storage device over an IP network. An iSCSI initiator is hardware or software that runs on a host and initiates I/O to an iSCSI target, which is a storage device (usually, a logical volume) that responds to read/write requests.
Tags : 
dell, iscsi, networked storage, initiator implementations, block-level data, fibre channel
    
Dell Storage
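The initiator/target exchange described in the Dell Storage abstract above can be illustrated with a toy model: a "target" serves block-level reads from a logical volume over TCP, and an "initiator" sends read requests for ranges of blocks. This is a simplified sketch, not the real iSCSI protocol (which wraps SCSI commands in PDUs); the block size, request format and helper names are illustrative assumptions.

```python
import socket
import struct
import threading

BLOCK_SIZE = 16  # illustrative; real block devices typically use 512 B or 4 KiB


def run_target(sock, volume):
    """Accept one initiator and answer block-read requests from the volume."""
    conn, _ = sock.accept()
    with conn:
        while True:
            header = conn.recv(8)
            if len(header) < 8:
                break  # initiator disconnected
            # Each request: logical block address + block count (big-endian)
            lba, count = struct.unpack("!II", header)
            start = lba * BLOCK_SIZE
            conn.sendall(volume[start:start + count * BLOCK_SIZE])


def initiator_read(port, lba, count):
    """Connect to the target and read `count` blocks starting at `lba`."""
    with socket.create_connection(("127.0.0.1", port)) as conn:
        conn.sendall(struct.pack("!II", lba, count))
        data = b""
        while len(data) < count * BLOCK_SIZE:
            data += conn.recv(4096)
        return data


# A 4-block "logical volume" backing the target
volume = bytes(range(64))

server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

threading.Thread(target=run_target, args=(server, volume), daemon=True).start()

block1 = initiator_read(port, 1, 1)  # read block 1, i.e. bytes 16..31
print(block1 == volume[16:32])
```

The point of the sketch is the division of roles: the target owns the storage and answers read/write requests, while the initiator on the host issues block-level I/O over an ordinary IP network.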
Published By: TeamQuest     Published Date: Apr 09, 2014
TeamQuest Director of Global Services Per Bauer explains how to manage services in relation to servers, storage, network, power and floor space. Understand costing data, incidents, business transaction volumes, and demand forecasts. Watch this short video to learn more about Optimization in a Box and how to quickly improve your ability to optimize business services today.
Tags : 
teamquest, virtualization, it professionals, optimize software, distributed environment, data center, performance
    
TeamQuest
Published By: CDW     Published Date: Aug 04, 2016
As data volumes grow, you need more than just storage space. Let us help you orchestrate a solution that brings you the scalability and agility you need to move your organization forward. Storage needs are changing rapidly, and legacy appliances and processes just can’t keep up. Old systems are running slowly and filling up fast. At CDW, we can help you evolve your storage with a smart solution that’s ready for what lies ahead.
Tags : 
data, technology, storage, best practices, best solutions
    
CDW
Published By: AWS     Published Date: Oct 12, 2018
Safeguarding your data is more important than ever. In today’s data-driven business landscape, companies are using their data to innovate, inform product improvements, and personalize services for their customers. The sheer volume of data collected for these purposes keeps growing, but the solutions available to organizations for processing and analyzing it become more efficient and intuitive every day. Reaching the right customers at the right time with the right offers has never been easier. With this newfound agility, however, come new opportunities for vulnerability. With so much riding on the integrity of your data and the services that make it secure and available, it’s crucial to have a plan in place for unexpected events that can wipe out your physical IT environment or otherwise compromise data access. The potential for natural disasters, malicious software attacks, and other unforeseen events necessitates that companies implement a robust disaster recovery (DR) strategy.
Tags : 
    
AWS
Published By: AWS     Published Date: Oct 26, 2018
Today’s organisations are tasked with analysing multiple data types coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organisations are finding that, in order to deliver analytic insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store enormous amounts of data in a central location, so it’s readily available to be categorised, processed, analysed, and consumed by diverse groups within an organisation. Since data, structured and unstructured, can be stored as-is, there’s no need to convert it to a predefined schema, and you no longer need to know in advance what questions you want to ask of your data.
Tags : 
data, lake, amazon, web, services, aws
    
AWS
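The schema-on-read idea in the data lake abstract above can be sketched in a few lines: raw records from different systems are stored as-is, with no upfront conversion to a shared schema, and a schema is applied only when a consumer reads the data. The helper names and the in-memory "lake" are illustrative assumptions, standing in for an object store such as Amazon S3.

```python
import csv
import io

lake = {}  # stand-in for an object store, keyed by data source


def ingest(source, raw_bytes):
    """Store the record exactly as it arrived; no schema is imposed here."""
    lake.setdefault(source, []).append(raw_bytes)


# Structured CSV from one system, semi-structured JSON from another,
# both landed in the lake untouched.
ingest("billing", b"customer,amount\nacme,120\nglobex,75\n")
ingest("clickstream", b'{"customer": "acme", "page": "/pricing"}')


def read_billing():
    """Apply a schema at read time: parse the CSV and type the columns."""
    rows = []
    for blob in lake["billing"]:
        reader = csv.DictReader(io.StringIO(blob.decode()))
        rows.extend(
            {"customer": r["customer"], "amount": int(r["amount"])} for r in reader
        )
    return rows


print(read_billing())
```

Because nothing was converted at ingest time, a different consumer could later read the same raw bytes with a different schema, which is the flexibility the abstract contrasts with traditional, schema-first data management systems.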
Published By: AWS     Published Date: Nov 28, 2018
Financial institutions run on data: collecting it, analyzing it, delivering meaningful insights, and taking action in real time. As data volumes increase, organizations demand a scalable analytics platform that can meet the needs of data scientists and business users alike. However, managing an on-premises analytics environment for a large and diverse user base can become time-consuming, costly, and unwieldy. Tableau Server on Amazon Web Services (AWS) is helping major Financial Services organizations shift data visualization and analytics workloads to the cloud. The result is fewer hours spent on manual work and more time to ask deeper questions and launch new data analyses, with easily scalable support for large numbers of users. In this webinar, you’ll hear how one major asset management company made the shift to cloud data visualization with Tableau Server on AWS. Discover lessons learned, best practices tailored to Financial Services organizations, and starting tactics for scalable analytics on the cloud.
Tags : 
    
AWS
Published By: CrowdStrike     Published Date: May 10, 2018
One of the biggest challenges to effectively stopping breaches lies in sifting through vast amounts of data to find the proverbial “needle in the haystack” – the subtle clues that indicate an attack is imminent or underway. As modern computer systems generate billions of events daily, the amount of data to analyze can reach petabytes. Compounding the problem, the data is often unstructured, discrete and disconnected. As a result, organizations struggle to determine how individual events may be connected to signal an impending attack. In this context, detecting attacks is often difficult, and sometimes impossible. This white paper describes how CrowdStrike solved this challenge by building its own graph data model – the CrowdStrike Threat Graph™ – to collect and analyze extremely large volumes of security-related data, and ultimately, to stop breaches. This revolutionary approach applies massive graph-based technologies, similar to the ones developed by Facebook and Google.
Tags : 
    
CrowdStrike
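The core idea in the abstract above, connecting discrete events so that a chain becomes visible where no single event is conclusive, can be sketched with a small event graph. The events, fields and linking rule below are hypothetical illustrations, not CrowdStrike's actual data model: events are linked when they share an entity (here, a host or a user), and a traversal recovers the connected chain.

```python
from collections import defaultdict, deque

# Hypothetical, disconnected-looking security events
events = [
    {"id": 1, "host": "web-01", "user": "svc",   "action": "phishing_link"},
    {"id": 2, "host": "web-01", "user": "svc",   "action": "macro_exec"},
    {"id": 3, "host": "db-02",  "user": "svc",   "action": "lateral_move"},
    {"id": 4, "host": "db-02",  "user": "admin", "action": "data_staging"},
    {"id": 5, "host": "hr-09",  "user": "alice", "action": "normal_login"},
]

# Link any two events that share a host or a user
graph = defaultdict(set)
for a in events:
    for b in events:
        if a["id"] != b["id"] and (a["host"] == b["host"] or a["user"] == b["user"]):
            graph[a["id"]].add(b["id"])


def connected(start):
    """Breadth-first traversal: all events reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen)


print(connected(1))  # events 1-4 form one chain; event 5 stays isolated
```

Individually, none of events 1 through 4 is decisive, but the traversal ties the phishing click to the later data staging through shared hosts and accounts, which is the kind of connection a graph model surfaces and a flat event log hides.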
Published By: TIBCO     Published Date: May 15, 2013
According to Forrester, most organizations today are only using 12% of their available data, and only 37% of organizations are planning some type of big data technology project. At a time when companies are seeing the volume of information increase quickly, it’s time to take a step back and look at the impact of big data. Join Mike Gualtieri, Principal Analyst at Forrester, for this webcast exploring the importance of integration in your big data initiatives. Discover how your ability to operate, make decisions, reduce risks and serve customers is inextricably linked to how well you’re able to handle your big data. Continue on to gain insight into: • 3 key big data management activities you need to consider • Technologies you need to create your big data ecosystem • Why a multi-dimensional view of the customer is the holy grail of individualization • Overcoming key integration challenges And more!
Tags : 
big data, integration, architecture, database, data warehousing, operations management
    
TIBCO
Published By: IBM     Published Date: May 27, 2014
Big data and analytics help insurance companies identify the next best action for customers. With the right solutions, companies can extract, integrate and analyze a large volume and variety of data, from call-center notes and voice recordings to web chats, telematics and social media
Tags : 
ibm, big data, analytics, insurance, insurance industry, big data solutions, integration, risk assessment, policy rates, customer retention, claims data, transaction data
    
IBM
Published By: IBM     Published Date: Feb 24, 2015
Big data and analytics help insurance companies identify the next best action for customers. With the right solutions, companies can extract, integrate and analyze a large volume and variety of data, from call-center notes and voice recordings to web chats, telematics and social media.
Tags : 
big data, ibm, claims operations, customer service
    
IBM
Published By: Tripp Lite     Published Date: Sep 30, 2015
This white paper: • Explains the staggering growth of digital data volume and the increasing demand for faster access • Examines the different types of data transmission • Outlines the two potential solutions for connecting 10Gb equipment with higher-speed equipment
Tags : 
equipment, higher-speed, data transmission, infrastructure, volume, integrate, network, speed, digital
    
Tripp Lite
Published By: Forte Wares     Published Date: Jun 24, 2014
The key to making big data initiatives a success lies in making the produced data more digestible and usable in decision making, rather than simply making more of it, resulting in an environment where information is used to generate real impact. Put another way, the survival of Big Data is more about making the right data (not just higher volume) available to the right people (not just higher variety) at the right time (not just higher velocity).
Tags : 
    
Forte Wares
Published By: Arcserve     Published Date: May 29, 2015
Today, data volumes are growing exponentially, and organizations of every size are struggling to manage what has become a very expensive and complex problem. It causes real issues such as: • Overprovisioning the backup infrastructure to anticipate rapid future growth. • Legacy systems that can’t cope, so backups take too long or are incomplete. • Missed recovery point objectives and recovery time targets. • Backups that overload infrastructure and network bandwidth. • Reluctance to embrace new technologies, such as cloud backup, because there is too much data to transfer over wide area networks.
Tags : 
backup infrastructure, legacy systems, overload infrastructure, cloud
    
Arcserve
Published By: Exablox     Published Date: Jan 27, 2015
When it comes to the increasingly complex task of managing data storage, many small and midsize organizations face even greater challenges than large, global enterprises. Small and midsize companies have ever-increasing volumes of information to manage and secure, and they confront a number of difficulties when it comes to storage. Among the biggest hurdles: • Scaling storage as the business grows rapidly • Meeting the rising expense of data storage capacity • Dealing with the complexity of management and architecture • Devoting precious staff time to managing storage and data backup. Whereas larger organizations have significant IT budgets and staff to handle storage-related challenges, small and midsize companies lack the IT resources to dedicate to storage management. Fortunately, there are new approaches to data storage on the market that can help such companies address their data storage needs without requiring dedicated storage management resources.
Tags : 
scaling storage, data storage capacity, data backup, data protection, data management, exablox, oneblox
    
Exablox
Published By: Lucidworks     Published Date: Feb 12, 2015
Search is all your data, all the time, at scale. Read this white paper today and learn about Lucidworks Fusion, the next-generation search platform built on Apache Solr.
Tags : 
search platform, search query, search technologies, big data, high-volume data, database advancements
    
Lucidworks
Published By: IBM     Published Date: May 28, 2014
Different types of data have different data retention requirements. In establishing information governance and database archiving policies, take a holistic approach by understanding where the data exists, classifying the data, and archiving the data. IBM InfoSphere Optim™ Archive solution can help enterprises manage and support data retention policies by archiving historical data and storing that data in its original business context, all while controlling growing data volumes and improving application performance. This approach helps support long-term data retention by archiving data in a way that allows it to be accessed independently of the original application.
Tags : 
ibm, data retention, information governance, archiving, historical data, integrating big data, governing big data, integration, best practices, big data, ibm infosphere, it agility, performance requirements, hadoop, scalability, data integration, big data projects, high-quality data, leverage data replication, data persistence
    
IBM
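The retention approach in the abstract above, moving historical rows out of the active store together with the business context needed to interpret them later, can be sketched with a minimal policy function. The record layout, the `source` and `schema_version` context fields, and the in-memory stores are illustrative assumptions, not the InfoSphere Optim API.

```python
from datetime import date

# Active application data and an archive store (stand-ins for a
# production database and an archive repository)
active = [
    {"order_id": 1, "customer": "acme",   "placed": date(2010, 3, 1)},
    {"order_id": 2, "customer": "globex", "placed": date(2013, 6, 9)},
]
archive = []


def apply_retention(cutoff):
    """Move rows older than `cutoff` to the archive, keeping context."""
    global active
    keep = []
    for row in active:
        if row["placed"] < cutoff:
            # Archive with business context so the record remains
            # readable independently of the original application.
            archive.append({"source": "orders", "schema_version": 1, **row})
        else:
            keep.append(row)
    active = keep


apply_retention(date(2012, 1, 1))
print(len(active), len(archive))  # 1 1
```

Shrinking the active store this way is what controls data growth and improves application performance, while the context stored alongside each archived row is what supports long-term retention: the data can be queried years later without the application that produced it.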
Published By: Symantec     Published Date: Jul 11, 2017
Cloud Access Security Brokers (CASBs) serve as a critical control point to ensure the secure and compliant use of cloud apps and services. Cloud service providers typically maintain a shared responsibility policy for security—they guarantee the integrity of their service infrastructure, but the customer is responsible for securing actual app usage. In addition to the growing cloud security challenges organizations face to safeguard data and protect against threats in the cloud, total volume of cloud app adoption is accelerating, with most of it being done by business units and employees without approval or security oversight from the IT organization. As a result, CASB functionality has become so critical that by 2020 it is projected that 80% of enterprises will use a CASB solution. (Gartner)
Tags : 
cloud, cloud service providers, cloud security, cloud applications
    
Symantec
Published By: IBM     Published Date: Aug 05, 2014
There is a lot of discussion in the press about Big Data. Big Data is traditionally defined in terms of the three V’s of Volume, Velocity, and Variety. In other words, Big Data is often characterized as high-volume, streaming, and including semi-structured and unstructured formats. Healthcare organizations have produced enormous volumes of unstructured data, such as the notes by physicians and nurses in electronic medical records (EMRs). In addition, healthcare organizations produce streaming data, such as from patient monitoring devices. Now, thanks to emerging technologies such as Hadoop and streams, healthcare organizations are in a position to harness this Big Data to reduce costs and improve patient outcomes. However, this Big Data has profound implications from an Information Governance perspective. In this white paper, we discuss Big Data Governance from the standpoint of three case studies.
Tags : 
ibm, data, big data, information, healthcare, governance, technology
    
IBM
Published By: IBM     Published Date: Aug 28, 2014
Data volumes are getting out of control, but choosing the right information lifecycle governance solution can be a huge challenge, with multiple stakeholders, numerous business processes, and extensive solution requirements. Use this requirements kit from the Compliance, Governance and Oversight Council (CGOC) to find the tools and technology you need.
Tags : 
ilg, data volumes, cgoc, information economics
    
IBM
Published By: IBM     Published Date: Apr 18, 2016
Today, data volumes are exploding in every facet of our lives. Business leaders are eager to harness the power of big data, but before setting out into the big data world it is important to understand that, as opportunities increase, ensuring that source information is trustworthy and protected becomes exponentially more difficult. This paper provides a detailed review of the best practices clients should consider before embarking on their big data integration projects.
Tags : 
ibm, big data, trusted data, data management, data solutions
    
IBM