database

Published By: IBM     Published Date: Apr 18, 2017
The data integration tool market was worth approximately $2.8 billion in constant currency at the end of 2015, an increase of 10.5% from the end of 2014. The discipline of data integration comprises the practices, architectural techniques and tools that ingest, transform, combine and provision data across the spectrum of information types in the enterprise and beyond — to meet the data consumption requirements of all applications and business processes. The biggest changes in the market from 2015 are the increased demand for data virtualization, the growing use of data integration tools to combine "data lakes" with existing integration solutions, and the overall expectation that data integration will become cloud- and on-premises-agnostic.
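The definition above names four core steps: ingest, transform, combine and provision. As a purely illustrative sketch (the function names and sample data are invented, not taken from any vendor's tool), those steps might look like:

```python
# Hypothetical sketch of the data-integration steps named above:
# ingest, transform, combine and provision, as small pure functions.

def ingest(rows):
    """Ingest raw records from a source system (here, a list of dicts)."""
    return [dict(r) for r in rows]

def transform(rows):
    """Normalize field formats so records from different sources can match."""
    return [{**r, "email": r["email"].strip().lower()} for r in rows]

def combine(crm_rows, billing_rows):
    """Join two sources on a shared key (email) into one customer view."""
    billing = {r["email"]: r for r in billing_rows}
    return [{**r, **billing.get(r["email"], {})} for r in crm_rows]

def provision(rows, fields):
    """Expose only the fields a consuming application requested."""
    return [{k: r[k] for k in fields if k in r} for r in rows]

crm = ingest([{"email": " Ada@Example.com ", "name": "Ada"}])
billing = ingest([{"email": "ada@example.com", "plan": "pro"}])
customers = provision(combine(transform(crm), transform(billing)),
                      ["email", "name", "plan"])
print(customers)  # one unified record per customer
```

A real integration tool adds scheduling, error handling and metadata management on top of this basic pipeline shape.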
Tags : 
data integration, data security, data optimization, data virtualization, database security, data analytics, data innovation
    
IBM
Published By: IBM     Published Date: Apr 18, 2017
Apache Hadoop technology is transforming the economics and dynamics of big data initiatives by supporting new processes and architectures that can help cut costs, increase revenue and create competitive advantage. An effective big data integration solution delivers simplicity, speed, scalability, functionality and governance to produce consumable data. To cut through the misinformation and develop an adoption plan for your Hadoop big data project, you must follow a best practices approach that takes into account emerging technologies, scalability requirements, and current resources and skill levels.
Tags : 
data integration, data security, data optimization, data virtualization, database security, data migration, data assets, data delivery
    
IBM
Published By: IBM     Published Date: Oct 01, 2016
In the era of always-on business, enterprises need reliable, secure and consistently fast access at all times. Overlay that with the reality of how organizations combine on-premises systems with cloud-based solutions, and it’s clear that a robust, agile and flexible database platform is mandatory.
Tags : 
cloud optimization, cloud efficiency, cloud management, cloud assurance, cloud visibility, enterprise management, data management
    
IBM
Published By: IBM     Published Date: May 09, 2017
How companies are managing growth, gaining insights and cutting costs in the era of big data
Tags : 
data management, data system, business development, software integration, resource planning, enterprise management, data collection
    
IBM
Published By: IBM     Published Date: Jun 20, 2017
Watch this on-demand webinar for a look at in-memory databases and better understand the advantages of in-memory vs. disk-based technology.
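The distinction the webinar covers can be seen concretely with SQLite (used here only as an illustration, not the product discussed): the same schema can live entirely in RAM or be persisted to a file.

```python
# Illustrative only: SQLite is not the product in the webinar, but it shows
# the in-memory vs. disk-based distinction in a few lines.
import os
import sqlite3
import tempfile

def load_and_count(conn):
    """Create a table, insert 1000 rows, and return the row count."""
    conn.execute("CREATE TABLE t (x INTEGER)")
    conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(1000)])
    return conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]

# In-memory: data lives in RAM and vanishes when the connection closes.
mem = sqlite3.connect(":memory:")
print(load_and_count(mem))  # 1000

# Disk-based: data is written to a file and survives the process.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
disk = sqlite3.connect(path)
load_and_count(disk)
disk.close()
print(os.path.exists(path))  # True: the disk copy persists
```

In-memory operation avoids disk I/O on every access, which is the core performance argument the webinar examines.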
Tags : 
data integration, data security, data optimization, data virtualization, in-memory, disk-based, technology
    
IBM
Published By: IBM     Published Date: Jul 21, 2016
IBM's recently released DB2 version 11.1 for Linux, Unix and Windows (LUW) is a hybrid database that IBM says can handle transactional and analytic workloads thanks to its BLU Acceleration technology, which features an in-memory column store for analytical workloads that can scale across a massively parallel cluster.
Tags : 
ibm, db2, analytics, mpp, data warehousing
    
IBM
Published By: IBM     Published Date: Jul 06, 2017
In the era of always-on business, enterprises need reliable, secure and consistently fast access at all times. Overlay that with the reality of how organizations combine on-premises systems with cloud-based solutions, and it’s clear that a robust, agile and flexible database platform is mandatory.
Tags : 
reliability, security, consistency, database, platform, access
    
IBM
Published By: IBM     Published Date: Jul 06, 2017
DB2 is a proven database for handling the most demanding transactional workloads. But the trend as of late is to enable relational databases to handle analytic queries more efficiently by adding an in-memory column store alongside to aggregate data and provide faster results. IBM's BLU Acceleration technology does exactly that. While BLU isn't brand new, the ability to spread the column store across a massively parallel processing (MPP) cluster of up to 1,000 nodes is a new addition to the technology. That, along with simpler monthly pricing options and integration with dashDB data warehousing in the cloud, makes DB2 for LUW a very versatile database.
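The column-store idea behind this can be sketched in a few lines of Python. This toy model (not IBM's implementation) shows why an analytic aggregate touches less data when each attribute is stored as its own array, and how an MPP cluster can split that array across nodes:

```python
# Toy illustration of a column store vs. a row store for analytic aggregates.
rows = [{"id": i, "region": "EU" if i % 2 else "US", "amount": i * 10}
        for i in range(6)]

# Row store: every aggregate walks entire records, including fields it
# does not need.
total_row_store = sum(r["amount"] for r in rows)

# Column store: each attribute is kept as its own contiguous array, so an
# aggregate reads only the one column it needs.
columns = {key: [r[key] for r in rows] for key in rows[0]}
total_col_store = sum(columns["amount"])
assert total_row_store == total_col_store == 150

# MPP flavor: partition the column across nodes and combine partial sums.
node_a, node_b = columns["amount"][:3], columns["amount"][3:]
total_mpp = sum(map(sum, (node_a, node_b)))
assert total_mpp == 150
```

Real column stores add compression and vectorized execution on top of this layout, but the data-locality advantage is the same.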
Tags : 
memory analytics, database, efficiency, acceleration technology, aggregate data
    
IBM
Published By: IBM     Published Date: Jul 06, 2017
Effectively using and managing information has become critical to driving growth in areas such as pursuing new business opportunities, attracting and retaining customers, and streamlining operations. In the era of big data, you must accommodate a rapidly increasing volume, variety and velocity of data while extracting actionable business insight from that data, faster than ever before. These needs create a daunting array of workload challenges and place tremendous demands on your underlying IT infrastructure and database systems. In many cases, these systems are no longer up to the task—so it’s time to make a decision. Do you use more staff to keep up with the fixes, patches, add-ons and continual tuning required to make your existing systems meet performance goals, or move to a new database solution so you can assign your staff to new, innovative projects that move your business forward?
Tags : 
database, growth, big data, it infrastructure, information management
    
IBM
Published By: IBM     Published Date: Sep 28, 2017
Here are the six reasons to change your database:
- Lower total cost of ownership
- Increased scalability and availability
- Flexibility for hybrid environments
- A platform for rapid reporting and analytics
- Support for new and emerging applications
- Greater simplicity
Download now to learn more!
Tags : 
scalability, hybrid environment, emerging applications, rapid reporting
    
IBM
Published By: IBM     Published Date: Sep 28, 2017
For increasing numbers of organizations, the new reality for development, deployment and delivery of applications and services is hybrid cloud. Few, if any, organizations are going to move all their strategic workloads to the cloud, but virtually every enterprise is embracing cloud for a wide variety of requirements. This paper outlines what readers should consider when making a strategic commitment to a database platform that will act as a bridge from legacy environments to the cloud.
Tags : 
bridging the cloud, database, hybrid cloud, cloud organization, strategic commitment
    
IBM
Published By: IBM     Published Date: Sep 28, 2017
Welcome to the era of the digital enterprise, where digital is your journey and cognitive is your destination. As business leaders, you are under growing pressure to use information to its fullest potential, delivering new customer experiences as fuel for business growth. The digital economy is changing the way we gather information, gain insights, reinvent our businesses and innovate both quickly and iteratively. A hybrid cloud environment, combining traditional IT systems and public cloud, enables you to extend business processes beyond the walls of your organization. For example, many organizations use public cloud as a collaborative development environment to create innovative applications that can then be ported to an on-premises or hybrid production environment.
Tags : 
cognitive era, digital, database, hybrid cloud environment, public cloud
    
IBM
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
As organizations develop next-generation applications for the digital era, many are using cognitive computing ushered in by IBM Watson® technology. Cognitive applications can learn and react to customer preferences, and then use that information to support capabilities such as confidence-weighted outcomes with data transparency, systematic learning and natural language processing. To make the most of these next-generation applications, you need a next-generation database. It must handle a massive volume of data while delivering high performance to support real-time analytics. At the same time, it must provide data availability for demanding applications, scalability for growth and flexibility for responding to changes.
Tags : 
database, applications, data availability, cognitive applications
    
Group M_IBM Q1'18
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
For increasing numbers of organizations, the new reality for development, deployment and delivery of applications and services is hybrid cloud. Few, if any, organizations are going to move all their strategic workloads to the cloud, but virtually every enterprise is embracing cloud for a wide variety of requirements. To accelerate innovation, improve the IT delivery economic model and reduce risk, organizations need to combine data and experience in a cognitive model that yields deeper and more meaningful insights for smarter decision-making. Whether the user needs a data set maintained in house for customer analytics or access to a cloud-based data store for assessing marketing program results — or any other business need — a high-performance, highly available, mixed-load database platform is required.
Tags : 
cloud, database, hybrid cloud, database platform
    
Group M_IBM Q1'18
Published By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
Effectively using and managing information has become critical to driving growth in areas such as pursuing new business opportunities, attracting and retaining customers, and streamlining operations. In the era of big data, you must accommodate a rapidly increasing volume, variety and velocity of data while extracting actionable business insight from that data, faster than ever before. These needs create a daunting array of workload challenges and place tremendous demands on your underlying IT infrastructure and database systems. This e-book presents six reasons why you should consider a database change, including opinions from industry analysts and real-world customer experiences. Read on to learn more.
Tags : 
database, streamlining, it infrastructure, database systems
    
Group M_IBM Q1'18
Published By: IBM     Published Date: Apr 03, 2018
Can your database systems handle data growth and keep up with performance requirements? Here are six reasons to change.
Tags : 
database systems, data management, hybrid data
    
IBM
Published By: IBM     Published Date: Apr 03, 2018
This paper gives key considerations when making a strategic commitment to a database platform.
Tags : 
data management, cloud, database platform
    
IBM
Published By: IBM     Published Date: Apr 03, 2018
The next generation of IBM Db2 helps organizations get more value from their big data to improve their IT economics.
Tags : 
ibm, data management, db2
    
IBM
Published By: Oracle     Published Date: Jan 28, 2019
Oracle Engineered Systems are architected to work as a unified whole, so organizations can hit the ground running after deployment. Organizations choose how they want to consume the infrastructure: on-premises, in a public cloud, or in a public cloud located inside the customer’s data center and behind their firewall using Oracle’s “Cloud at Customer” offering. Oracle Exadata and Zero Data Loss Recovery Appliance (Recovery Appliance) offer an attractive alternative to do-it-yourself deployments. Together, they provide an architecture designed for scalability, simplified management, lower total cost of ownership, reduced downtime, zero data loss, and an increased ability to keep software updated with security and patching. Download this whitepaper to discover ten capabilities to consider for protecting your Oracle Database environments.
Tags : 
    
Oracle
Published By: Oracle     Published Date: Jan 28, 2019
Traditionally, the best practice for mission-critical Oracle Database backup and recovery was to use storage-led, purpose-built backup appliances (PBBAs) such as Data Domain, integrated with RMAN, Oracle’s automated backup and recovery utility. This disk-based backup approach solved two problems:
1) It enabled faster recovery (from disk versus tape).
2) It increased recovery flexibility by storing many more backups online, enabling restoration from that data to recover production databases, and provisioning copies for test/dev.
At its core, however, this approach remains a batch process that involves many dozens of complicated steps for backups and even more steps for recovery. Oracle’s Zero Data Loss Recovery Appliance (RA) customers report that total cost of ownership (TCO) and downtime costs (e.g. lost revenue due to database or application downtime) are significantly reduced due to the simplification and, where possible, the automation of the backup and recovery process.
Tags : 
    
Oracle
Published By: Oracle     Published Date: Jan 28, 2019
For more than a decade, Oracle has developed and enhanced its ZFS Storage Appliance, giving its users a formidable unified and enterprise-grade storage offering. The latest release, ZS7-2, boasts upgraded hardware and software and is a timely reminder that more users might do well to evaluate this offering. It has a trifecta of advantages:
(1) Its notable performance, price-performance, and flexibility are all improved in this new release.
(2) There is a surprisingly inclusive set of functionalities, including excellent storage analytics that were developed even before analytics became a contemporary “must-have”.
(3) There is a compelling group of “better together” elements that make ZFS Storage Appliance a particularly attractive choice for both Oracle Database environments and users that want to seamlessly integrate a cloud component into their IT infrastructure.
Given the proven abilities of Oracle’s prior models, it’s also safe to assume that the new ZS7-2 will outperform other m
Tags : 
    
Oracle
Published By: Oracle     Published Date: Jan 28, 2019
Databases tend to hold an organization’s most important information and power the most crucial applications. It only makes sense, then, to run them on a system that’s engineered specifically to optimize database infrastructure. Yet some companies continue to run their databases on do-it-yourself (DIY) infrastructure, using separate server, software, network, and storage systems. It’s a setup that increases risk, cost, complexity, and time spent deploying and managing the systems, given that it typically involves at least three different IT groups.
Tags : 
    
Oracle
Published By: Intel     Published Date: Apr 16, 2019
Gartner predicts that the public cloud market will surpass USD 300 billion by 2021. With the big players (Amazon, Google, Microsoft and IBM) taking home 63 percent of the market share, how will next wave CSPs stand out from the crowd? Download Intel's latest whitepaper, ‘Differentiating for Success: A Guide for Cloud Service Providers’, to discover how to offer unique services, including:
- Providing workload-specific optimizations, for example machine learning or high-performance computing
- Targeting a particular geographical area
- Focusing on an industry, such as financial services
- Delivering emerging technology, such as virtual reality, in-memory databases, and containerization
Tags : 
    
Intel
Published By: Intel     Published Date: Apr 16, 2019
The data center is coming under immense pressure. The boom in connected devices means increasing volumes of data – and all of it needs processing. One way for CSPs to accelerate customer workloads is by using FPGAs, which are easier to use than ever before. Download Intel's latest eGuide, ‘FPGA-as-a-Service: A Guide for CSPs’, to discover:
• How to add FPGAs to the data center
• The structure of the Intel® Acceleration Stack for FPGAs
• Adding off-the-shelf accelerator functions
• How FPGAs can accelerate many cloud services, such as database as a service and analytics as a service
Tags : 
    
Intel
Published By: Symantec     Published Date: Nov 24, 2014
This IDC Executive Brief discusses the evolution and challenges of data protection for virtual environments and how a modern data protection solution can enable both virtualization professionals and storage managers to perform successful backups and, more importantly, guaranteed restores. Benefits and challenges of data protection for virtual environments are discussed, as well as emerging best practices for unified data protection.
Tags : 
unified data protection, database security
    
Symantec