storage and processing

Results 1 - 18 of 18
Published By: Cisco EMEA     Published Date: Nov 13, 2017
Big data and analytics is a rapidly expanding field of information technology. Big data incorporates technologies and practices designed to support the collection, storage, and management of a wide variety of data types that are produced at ever-increasing rates. Analytics combines statistics, machine learning, and data preprocessing to extract valuable information and insights from big data.
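The pipeline the abstract describes, preprocessing followed by a statistical model, can be sketched in a few lines. All data and names below are illustrative, not from any of the listed papers.

```python
# Minimal analytics sketch: clean raw records, then fit a trivial
# predictive model (ordinary least squares in closed form).
from statistics import mean

def fit_line(xs, ys):
    """Least-squares fit for y = a + b*x."""
    mx, my = mean(xs), mean(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Preprocessing: drop malformed records before analysis.
raw = [("2017-01", 10.0), ("2017-02", None), ("2017-03", 30.0), ("2017-04", 40.0)]
clean = [(i, v) for i, (_, v) in enumerate(raw) if v is not None]
a, b = fit_line([x for x, _ in clean], [y for _, y in clean])
```

Real deployments swap the toy model for a library such as scikit-learn or Spark MLlib, but the collect-clean-model shape is the same.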
Tags : 
big data, analytics, virtualization, cloudera, ibm, sas, sap, splunk
    
Cisco EMEA
Published By: AWS     Published Date: Sep 05, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources.
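The "query S3 without loading" workflow typically has two steps: register an external schema over a data catalog, then query S3-resident data like an ordinary table. The schema, database, table, and IAM role names below are hypothetical; in practice these statements run through any Redshift SQL client.

```python
# Sketch of a Spectrum workflow as SQL strings (names are illustrative).
DDL = """
CREATE EXTERNAL SCHEMA IF NOT EXISTS s3_logs
FROM DATA CATALOG DATABASE 'logs_db'
IAM_ROLE 'arn:aws:iam::123456789012:role/SpectrumRole';
"""

# The query scans S3 directly -- nothing is loaded into the cluster first,
# which is how storage keeps growing independently of cluster size.
query = """
SELECT event_date, COUNT(*) AS events
FROM s3_logs.clickstream
WHERE event_date >= '2018-01-01'
GROUP BY event_date;
"""
```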
Tags : 
    
AWS
Published By: Oracle     Published Date: Aug 09, 2018
The purpose of IT backup and recovery systems is to avoid data loss and to recover quickly, thereby minimizing downtime costs. Traditional storage-centric data protection architectures, such as purpose-built backup appliances (PBBAs) and the conventional backup and restore processing supporting them, are prone to failure on recovery. This is because the processes, both automated and manual, are too numerous, too complex, and too difficult to test adequately. In turn, this leads to unacceptable levels of failure for today’s mission-critical applications, and a poor foundation for digital transformation initiatives. Governments are taking notice. Heightened regulatory compliance requirements have implications for data recovery processes and are an unwelcome but timely catalyst for companies to get their recovery houses in order. Onerous malware such as ransomware, and other cyber attacks, increase the imperative for organizations to have highly granular recovery mechanisms in place that allow
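The abstract's core complaint is that restore paths go untested. A minimal sketch of the opposite discipline, verifying every backup by restoring it and comparing checksums, looks like this; the vault structure and function names are illustrative only.

```python
# Toy backup store where every restore is verified against the checksum
# recorded at backup time, so silent corruption is detected, not discovered
# during an outage.
import hashlib

def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def back_up(data: bytes, vault: dict) -> str:
    digest = checksum(data)
    vault[digest] = data                 # store the copy keyed by checksum
    return digest

def verify_restore(digest: str, vault: dict) -> bytes:
    restored = vault[digest]
    if checksum(restored) != digest:     # the test-on-restore step
        raise IOError("restore verification failed")
    return restored

vault = {}
ref = back_up(b"mission-critical records", vault)
restored = verify_restore(ref, vault)
```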
Tags : 
    
Oracle
Published By: Dell PC Lifecycle     Published Date: Mar 09, 2018
When your company’s work demands a new storage array, you have the opportunity to invest in a solution that can support demanding workloads simultaneously, such as online transaction processing (OLTP) and data mart loading. At Principled Technologies, we compared Dell EMC™ PowerEdge™ R930 servers with Intel® Xeon® processors and the Dell EMC Unity 400F All Flash storage array to HPE ProLiant DL580 Gen9 servers with the HPE 3PAR 8400 array in three hands-on tests to determine how well each solution could serve a company during these database-intensive tasks. Intel Inside®. New Possibilities Outside.
Tags : 
    
Dell PC Lifecycle
Published By: Pure Storage     Published Date: Jan 12, 2018
Data is growing at an astonishing rate, and that growth will continue. New techniques in data processing and analytics, including AI, machine learning, and deep learning, allow specially designed applications not only to analyze data but to learn from the analysis and make predictions. Processing that data requires computer systems built around multi-core CPUs or GPUs, parallel processing, and extremely fast networks. However, legacy storage solutions are based on architectures that are decades old, unscalable, and poorly suited to the massive concurrency that machine learning requires. Legacy storage is becoming a bottleneck in big data processing, and a new storage technology is needed to meet the performance demands of data analytics.
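The concurrency the abstract refers to is a fan-out pattern: many workers transforming shards of data at once, which serial storage paths cannot feed. A single-process toy version, with an illustrative stand-in for the heavy transform, shows the shape; the serial and parallel runs must agree, only the degree of concurrency changes.

```python
# Fan-out sketch: the same feature extraction run serially and in parallel.
from concurrent.futures import ThreadPoolExecutor

def extract_features(record):
    # Stand-in for a CPU/GPU-heavy transform.
    return sum(record) / len(record)

records = [[i, i + 1, i + 2] for i in range(1000)]

serial = [extract_features(r) for r in records]
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel = list(pool.map(extract_features, records))
```

In a real training pipeline the workers are GPUs and the bottleneck question is whether storage can keep all of them supplied with data at once.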
Tags : 
reporting, artificial intelligence, insights, organization, institution, recognition
    
Pure Storage
Published By: NetApp     Published Date: Feb 19, 2015
NetApp Flash Pool is a storage cache option within the NetApp Virtual Storage Tier product family, available for NetApp FAS storage systems. A Flash Pool configures solid state drives (SSDs) and hard disk drives (HDDs) into a single storage pool, known as an “aggregate” in NetApp parlance, with the SSDs providing a fast response time cache for volumes that are provisioned on the Flash Pool aggregate. In this lab evaluation, NetApp commissioned Demartek to evaluate the effectiveness of Flash Pool with different types and numbers of hard disk drives using an online transaction processing (OLTP) database workload, and to evaluate the performance of Flash Pool in a clustered Data ONTAP environment during a cluster storage node failover scenario. In the report, you’ll discover how Demartek test engineers documented a 283% gain in IOPS and a 66x reduction in latency after incorporating NetApp Flash Pool technology.
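Flash Pool's effect can be sketched as a small fast cache (the SSD tier) in front of a large slow store (the HDD tier): hot blocks are served from the cache after their first read. This is a toy LRU model for illustration, not NetApp's actual caching algorithm.

```python
# Toy tiered read path: SSD tier modeled as an LRU cache over an HDD store.
from collections import OrderedDict

class TieredRead:
    def __init__(self, hdd: dict, cache_blocks: int):
        self.hdd = hdd
        self.cache = OrderedDict()          # LRU order: SSD tier
        self.cache_blocks = cache_blocks
        self.hits = self.misses = 0

    def read(self, block):
        if block in self.cache:
            self.hits += 1
            self.cache.move_to_end(block)   # refresh LRU position
            return self.cache[block]
        self.misses += 1
        data = self.hdd[block]              # slow-path read from HDD
        self.cache[block] = data
        if len(self.cache) > self.cache_blocks:
            self.cache.popitem(last=False)  # evict the coldest block
        return data

tier = TieredRead({n: f"data-{n}" for n in range(100)}, cache_blocks=10)
for _ in range(3):
    tier.read(7)    # first read misses, the next two hit the SSD tier
```

The latency gains the report measures come from exactly this hit/miss asymmetry, at OLTP scale.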
Tags : 
    
NetApp
Published By: NetApp     Published Date: Sep 22, 2014
Tags : 
flash pool, fas storage systems, ssd, online transaction processing, cluster storage
    
NetApp
Published By: IBM APAC     Published Date: Mar 19, 2018
Finnish telecom giant DNA’s vision is to have the most satisfied customers. The company achieves this by using flash storage to accelerate daily reports on customer preferences and to make agile business decisions accordingly. Read how DNA used IBM flash storage to cut its report processing time by 66%, giving it the insights it needs to deliver the most relevant and valuable experiences to its subscribers.
Tags : 
    
IBM APAC
Published By: AWS     Published Date: Nov 14, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources. This e-book aims to provide you with expert tips on how to use Amazon Redshift Spectrum to increase performance and potentially reduce the cost of your queries.
Tags : 
    
AWS
Published By: Amazon Web Services     Published Date: Sep 05, 2018
Tags : 
    
Amazon Web Services
Published By: HP     Published Date: Oct 09, 2008
Despite its six syllables, 'virtualization' is a straightforward concept. It can enable your organization to get more value not only from computer and storage hardware but also from the labor required to keep your systems up and running. Virtualization is an approach to pooling and sharing IT resources so the supply of resources—processing power, storage, networking and so on—can flexibly and automatically meet fluctuating business demand. Virtualization can improve the quality of your IT services, enabling more consistency and predictability of operational availability.
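"Pooling and sharing IT resources" can be shown in miniature: a pool hands out capacity to whichever workload needs it and reclaims it afterward, instead of dedicating fixed hardware per workload. The class and workload names are illustrative only.

```python
# Toy resource pool: capacity flows to fluctuating demand rather than
# being statically assigned to one workload.
class ResourcePool:
    def __init__(self, total_gb: int):
        self.total_gb = total_gb
        self.allocated = {}                 # workload -> GB in use

    def available(self) -> int:
        return self.total_gb - sum(self.allocated.values())

    def allocate(self, workload: str, gb: int) -> bool:
        if gb > self.available():
            return False                    # demand exceeds the pool
        self.allocated[workload] = self.allocated.get(workload, 0) + gb
        return True

    def release(self, workload: str):
        self.allocated.pop(workload, None)

pool = ResourcePool(total_gb=100)
ok_web = pool.allocate("web", 60)
ok_batch = pool.allocate("batch", 30)
ok_full = pool.allocate("analytics", 20)    # fails: pool nearly full
pool.release("batch")                       # demand fluctuates...
ok_retry = pool.allocate("analytics", 20)   # ...and the pool adapts
```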
Tags : 
mission critical, virtualization
    
HP
Published By: Veeam '18     Published Date: Dec 04, 2018
The new EU General Data Protection Regulation (GDPR) is the most important change in data privacy regulation in 22 years, and it will have a profound impact on every organization that relies on the storage and processing of personal data of EU citizens. Starting May 25, 2018, the penalties begin for noncompliance, with GDPR fines of up to 4% of annual global revenue or 20 million euros, whichever is greater. As this new regulation also impacts Veeam®, we wanted to share our insights on our road to compliance. In a new executive brief, GDPR: 5 Lessons Learned, Veeam Compliance Experience Shared, we walk through these lessons and share how our software played a critical role within data management and protection strategies to ensure we remain compliant while delivering Availability for the Always On Enterprise™.
Tags : 
    
Veeam '18
Published By: HPE Intel     Published Date: Feb 19, 2016
The rising demands for capturing and managing video surveillance data are placing new pressures on state and local governments, law enforcement agencies, and education officials. The challenges go beyond the expanding costs of equipment, storage, software, and management time. Officials must also lay the right foundation to scale their capabilities, improve performance, and still remain flexible in a rapidly evolving ecosystem of surveillance tools. Find out what state, local government, and education IT leaders need to know, and what steps you can take to:
• Improve performance and ROI by supporting more cameras per server for the dollars invested
• Optimize and simplify the management of daily surveillance processing, including the integration of facial recognition, redaction, and analysis software
• Fortify reliability and dependability to reduce the risk of surveillance data retrieval failures
Tags : 
    
HPE Intel
Published By: Intralinks     Published Date: Mar 12, 2014
In the age of Edward Snowden and the NSA, there are increasing concerns about data privacy and especially where best to keep data secure. The prevalence of cloud computing and cloud-based storage and collaboration services is only exacerbating these concerns. Many organizations are confused about regulations that protect data in different countries and jurisdictions, and don’t know what steps to take to ensure their cloud collaboration vendor can provide adequate safeguards. While pushback and reform efforts are evolving, the reality is that companies must operate within the law. Deciding where to house your data and how to move it is an exercise in understanding the relevant legal regimes and the application of risk analysis in the decision making process. This white paper will examine those elements with regard to cloud-based storage and processing.
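The decision process the paper describes, choosing a storage location from the legal regime governing the data, can be made explicit as a policy table consulted before any data is placed. The data classes, regions, and rules below are hypothetical examples, not legal guidance.

```python
# Sketch of data-residency routing: storage region is chosen by policy,
# with a fallback to a compliant region when the preferred one is not allowed.
RESIDENCY_RULES = {
    "eu_personal": {"allowed_regions": ["eu-west", "eu-central"]},
    "us_internal": {"allowed_regions": ["us-east", "us-west", "eu-west"]},
}

def choose_region(data_class: str, preferred: str) -> str:
    allowed = RESIDENCY_RULES[data_class]["allowed_regions"]
    if preferred in allowed:
        return preferred
    return allowed[0]       # fall back to a compliant region

eu_choice = choose_region("eu_personal", preferred="us-east")
us_choice = choose_region("us_internal", preferred="us-east")
```

Encoding the rules as data rather than scattered conditionals also gives auditors one artifact to review, which is the risk-analysis point the paper makes.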
Tags : 
data privacy, technology, data, security, safeguards, cloud computing
    
Intralinks
Published By: HP     Published Date: Aug 25, 2014
At first, virtualization efforts focused on optimization – making the server more efficient. These efforts transitioned into standardizing virtualization across the data center, including management of servers, storage and networking. Now customers are looking to use virtualization technology and all supporting infrastructure as a complete and extremely efficient system. HP ConvergedSystem 300 for Virtualization has been designed to address one of today’s top priorities for IT organizations – deploying applications faster through virtualization but in a way that does not increase IT complexity. Pre-configured to meet the needs of mid-market customers, HP ConvergedSystem 300 for Virtualization offerings can be easily, reliably, and rapidly deployed to support a variety of virtualized application use cases such as general IT infrastructure, decision support, collaboration or business processing.
Tags : 
converged system, server, complexity, storage, networking, virtualization, data center, cloud, infrastructure, red hat, integrated, optimization, solution, standardizing
    
HP
Published By: Dell     Published Date: Aug 24, 2015
To extract value from an ever-growing onslaught of data, your organization needs next-generation data management, integration, storage and processing systems that allow you to collect, manage, store and analyze data quickly, efficiently and cost-effectively. That’s the case with Dell | Cloudera® Apache™ Hadoop® solutions for big data.
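The "collect, manage, store and analyze" pipeline that Hadoop popularized reduces to the MapReduce pattern. A single-process word count shows the shape of it; real Hadoop distributes these same two phases across a cluster, and the input lines here are illustrative.

```python
# MapReduce in miniature: a map phase emitting (key, 1) pairs and a
# reduce phase summing per key, exactly the word-count canonical example.
from collections import defaultdict

def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    totals = defaultdict(int)
    for word, count in pairs:
        totals[word] += count       # shuffle + sum per key
    return dict(totals)

counts = reduce_phase(map_phase(["big data", "Big insight", "data data"]))
```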
Tags : 
dell, data management, data integration, big data, hadoop
    
Dell
Published By: NEC     Published Date: Aug 26, 2014
In addition to high reliability and availability, enterprise mission-critical applications, data centers operating 24x7, and data analysis platforms all demand powerful data processing capabilities and stability. The NEC PCIe SSD Appliance for Microsoft® SQL Server® is a best-practice reference architecture for such demanding workloads. It comprises an Express 5800 Scalable Enterprise Server Series server with Intel® Xeon® processor E7 v2 family CPUs, high-performance HGST FlashMAX II PCIe server-mounted flash storage, and Microsoft® SQL Server® 2014. When compared with the previous reference architecture, based on a server with Intel® Xeon® processor E7 family CPUs, benchmark testing demonstrated a performance improvement of up to 173% in logical scan rate in a data warehouse environment. The testing also demonstrated consistently fast and stable performance for the online transaction processing (OLTP) workloads that could be encountered in production.
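An OLTP workload of the kind benchmarked here is, structurally, many small independent transactions. The sketch below uses sqlite3 only to show that shape (one commit per order); the reference architecture itself runs SQL Server on PCIe flash, and the table and function names are illustrative.

```python
# OLTP in miniature: each order is its own short transaction.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER)")

def place_order(qty: int) -> int:
    with conn:      # the connection context manager commits per transaction
        cur = conn.execute("INSERT INTO orders (qty) VALUES (?)", (qty,))
    return cur.lastrowid

for q in (1, 5, 3):
    place_order(q)
total = conn.execute("SELECT SUM(qty) FROM orders").fetchone()[0]
```

Benchmarks like the one in this abstract measure how many such transactions per second the storage path can sustain while staying consistent.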
Tags : 
sql, datacenter, servers, virtualization, customer value, analytics, application owners, system integrators, big data, reliability, enterprise, availability, serviceability, processor
    
NEC