data replication

Results 1 - 25 of 82
Published By: Silver Peak     Published Date: Dec 31, 9999
Offsite data replication is key to ensuring ongoing business operations, but it can be complex and costly, especially when performed over long distances. Join this discussion to discover how you can apply fast, cost-effective, and reliable remote replication that can:
- Meet Recovery Point Objectives (RPO) by reducing remote replication times by up to 20X
- Reduce bandwidth costs and extend replication distances
- Lower storage costs while increasing storage flexibility
- Leverage emerging cloud and virtualization technologies for better offsite disaster recovery
Hear experts and users from Dell Compellent, Silver Peak, and AlaskaUSA discuss essential data replication strategies and technologies.
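The RPO math behind claims like a 20X replication speedup can be sketched in a few lines. This is an illustrative back-of-the-envelope model, not any vendor's sizing tool; the function names and the example figures (delta size, link speed) are assumptions for the example.

```python
def replication_cycle_seconds(delta_bytes: int, wan_mbps: float,
                              dedupe_ratio: float = 1.0) -> float:
    """Time to ship one replication delta over a WAN link.

    dedupe_ratio > 1 models WAN optimization (e.g. a 20x reduction
    sends only 1/20th of the bytes on the wire).
    """
    wire_bytes = delta_bytes / dedupe_ratio
    link_bytes_per_sec = wan_mbps * 1_000_000 / 8
    return wire_bytes / link_bytes_per_sec

def meets_rpo(delta_bytes: int, wan_mbps: float, rpo_seconds: float,
              dedupe_ratio: float = 1.0) -> bool:
    """A cycle meets the RPO if the delta ships within the RPO window."""
    return replication_cycle_seconds(
        delta_bytes, wan_mbps, dedupe_ratio) <= rpo_seconds

# Hypothetical case: a 50 GB delta over a 100 Mbit/s link.
raw = replication_cycle_seconds(50 * 10**9, 100)                     # 4000 s
optimized = replication_cycle_seconds(50 * 10**9, 100,
                                      dedupe_ratio=20)               # 200 s
```

With those assumed numbers, an aggressive RPO that the raw link cannot meet becomes achievable once a 20X wire-data reduction is applied.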
Tags : 
offsite data, remote replication, reliable, storage flexibility, cloud, virtualization, technology, offsite disaster recovery, security, knowledge management, storage
    
Silver Peak
Published By: DataCore     Published Date: Jun 06, 2019
More than 54% of companies have experienced a downtime event that lasted more than 8 hours (one full work day) in the past 5 years. And the risk of major disruptions to business continuity practices looms ever larger, primarily due to troubling dependencies on the location, topology, and suppliers of data storage. Download this infographic to see how you can preserve proven business continuity practices despite inevitable changes to your data storage infrastructure.
Tags : 
business continuity, disaster recovery, data protection, software defined storage, virtualization, storage, downtime, high availability, replication, storage infrastructure
    
DataCore
Published By: DataCore     Published Date: Jun 06, 2019
With so many moving pieces involved in business continuity and disaster recovery planning, anticipating the downstream impact of new SAN arrays, hyperconverged systems, and disaster recovery site relocation is increasingly difficult. Learn how DataCore SDS services help you adapt your safeguards accordingly in the face of these common initiatives and business challenges:
- Storage Array Replacement
- Mergers and Acquisitions
- Cloud Replication
- Hyperconverged Infrastructure
- Multi-Site Metro Cluster
Tags : 
business continuity, disaster recovery, data protection, software defined storage, virtualization, storage, downtime, high availability, replication, storage infrastructure, cloud, hyperconverged
    
DataCore
Published By: DataCore     Published Date: Jun 06, 2019
Nothing in Business Continuity circles ranks higher in importance than risk reduction. Yet the risk of major disruptions to business continuity practices looms ever larger today, mostly due to the troubling dependencies on the location, topology and suppliers of data storage. Get insights on how to avoid spending time and money reinventing BC/DR plans every time your storage infrastructure changes.
Tags : 
business continuity, disaster recovery, data protection, software defined storage, virtualization, storage, downtime, high availability, replication, storage infrastructure
    
DataCore
Published By: Attunity     Published Date: Feb 12, 2019
This technical whitepaper by Radiant Advisors covers key findings from their work with a network of Fortune 1000 companies and clients from various industries. It assesses the major trends and offers tips for gaining access to and optimizing data streaming for more valuable insights. Read this report to learn from real-world successes in modern data integration, and better understand how to maximize the use of streaming data. You will also learn about the value of populating a cloud data lake with streaming operational data, leveraging database replication, automation, and other key modern data integration techniques. Download this whitepaper today to learn about the latest approaches to modern data integration and streaming data technologies.
Tags : 
streaming data, cloud data lakes, cloud data lake, data lake, cloud, data lakes, streaming data, change data capture, cloud computing, modern data integration, data integration, data analytics, cloud-based data lake, enterprise data, self-service data
    
Attunity
Published By: Attunity     Published Date: Feb 12, 2019
Read this technical whitepaper to learn how data architects and DBAs can avoid the struggle of complex scripting for Kafka in modern data environments. You'll also gain tips on how to avoid the time-consuming hassle of manually configuring data producers and data type conversions. Specifically, this paper will guide you on how to overcome these challenges by leveraging innovative technology such as Attunity Replicate. The solution can easily integrate source metadata and schema changes for automated configuration of real-time data feeds, following best practices.
Tags : 
data streaming, kafka, metadata integration, metadata, data streaming, apache kafka, data integration, data analytics, database transactions, streaming environments, real-time data replication, data configuration
    
Attunity
Published By: Attunity     Published Date: Nov 15, 2018
With the opportunity to leverage new analytic systems for Big Data and Cloud, companies are looking for ways to deliver live SAP data to platforms such as Hadoop, Kafka, and the Cloud in real time. However, making live production SAP data seamlessly available wherever needed across diverse platforms and hybrid environments often proves a challenge. Download this paper to learn how Attunity Replicate's simple, real-time data replication and ingest solution can empower your team to meet fast-changing business requirements in an agile fashion. Our universal SAP data availability solution for analytics supports decisions to improve operations, optimize customer service, and enable companies to compete more effectively.
Tags : 
    
Attunity
Published By: Attunity     Published Date: Nov 15, 2018
Change data capture (CDC) technology can modernize your data and analytics environment with scalable, efficient, real-time data replication that does not impact production systems. To realize these benefits, enterprises need to understand how this critical technology works, why it's needed, and what their Fortune 500 peers have learned from their CDC implementations. This book serves as a practical guide for enterprise architects, data managers, and CIOs as they enable modern data lake, streaming, and cloud architectures with CDC. Read this book to understand:
- The rise of data lake, streaming, and cloud platforms
- How CDC works and enables these architectures
- Case studies of leading-edge enterprises
- Planning and implementation approaches
Tags : 
optimize customer service
    
Attunity
Published By: Datastax     Published Date: Aug 15, 2018
Nobody likes being bogged down in operations, least of all database administrators who have enough to worry about already. The new DataStax Enterprise (DSE) has made it especially easy to be a DBA. With features like NodeSync, Advanced Replication, and OpsCenter, DSE is Apache Cassandra made simple. Read this white paper to learn how DSE simplifies your operations.
Tags : 
    
Datastax
Published By: Cohesity     Published Date: May 04, 2018
Cohesity provides the only hyper-converged platform that eliminates the complexity of traditional data protection solutions by unifying your end-to-end data protection infrastructure, including target storage, backup, replication, disaster recovery, and cloud tiering. Cohesity DataPlatform provides scale-out, globally deduped, highly available storage to consolidate all your secondary data, including backups, files, and test/dev copies. Cohesity also provides Cohesity DataProtect, a complete backup and recovery solution fully converged with Cohesity DataPlatform. It simplifies backup infrastructure and eliminates the need to run separate backup software, proxies, media servers, and replication. This paper focuses on the business and technical benefits of Cohesity DataPlatform for the data protection use case. It is intended for IT professionals interested in learning more about Cohesity's technology differentiation and the advantages it offers for data protection - (i) Elim
Tags : 
    
Cohesity
Published By: Veeam '18     Published Date: Mar 13, 2018
Disaster recovery (DR) planning has a reputation for being difficult and time consuming. Setting up alternate processing sites, procuring hardware, establishing data replication, and failover testing have been incredibly expensive undertakings. To top it all off, the need for 24x7x365 business application availability threatens to make disaster recovery planning an exercise in futility.
Tags : 
draas, business, applications, planning, business, optimization
    
Veeam '18
Published By: Cisco EMEA     Published Date: Nov 13, 2017
The HX Data Platform uses a self-healing architecture that implements data replication for high availability, remediates hardware failures, and alerts your IT administrators so that problems can be resolved quickly and your business can continue to operate. Space-efficient, pointer-based snapshots facilitate backup operations, and native replication supports cross-site protection. Data-at-rest encryption protects data from security risks and threats. Integration with leading enterprise backup systems allows you to extend your preferred data protection tools to your hyperconverged environment.
Tags : 
hyperflex, systems, data platform, storage efficiency, business, cisco
    
Cisco EMEA
Published By: IBM     Published Date: Aug 23, 2017
To compete in today's fast-paced business climate, enterprises need accurate and frequent sales and customer reports to make real-time operational decisions about pricing, merchandising, and inventory management. They also require greater agility to respond to business events as they happen, and more visibility into business activities so information and systems are optimized for peak efficiency and performance. By making use of data capture and business intelligence to integrate and apply data across the enterprise, organizations can capitalize on emerging opportunities and build a competitive advantage.
Tags : 
ibm, data replication, inventory management, competitive advantage
    
IBM
Published By: Carbonite     Published Date: Aug 02, 2017
The risk and downtime of server migration have locked many IT teams into platforms that are no longer ideal. Migrations are often delayed or avoided, whether it's moving data to the cloud or back on premises, or upgrading the hardware or the database software. This results in lost opportunities, unnecessary costs, and a lack of agility that today's IT teams can no longer afford. Carbonite Move Powered by DoubleTake quickly and easily migrates physical, virtual, and cloud workloads over any distance with minimal risk and near-zero downtime. Using efficient real-time, byte-level replication technology, Carbonite Move creates a replica of the data, application, database, or entire server being migrated and keeps it in sync with the production system. The migrated data can be validated without disrupting business operations, and downtime is limited to the seconds or minutes required for cutover to the new server.
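The key property of byte- or block-level replication is that only the regions that changed travel to the replica, which is why the replica can be kept in sync continuously without re-copying the whole server. A minimal sketch of that idea, comparing fixed-size block hashes and copying only the differing blocks (this is an illustration of the general technique, not DoubleTake's actual wire protocol; the tiny 4-byte block size is purely for readability):

```python
import hashlib

BLOCK = 4  # toy block size; real systems use kilobyte-scale blocks

def block_hashes(data: bytes):
    """Hash each fixed-size block of the payload."""
    return [hashlib.sha256(data[i:i + BLOCK]).digest()
            for i in range(0, len(data), BLOCK)]

def changed_blocks(source: bytes, replica: bytes):
    """Indices of blocks whose content differs (or is missing) on the replica."""
    src, rep = block_hashes(source), block_hashes(replica)
    return [i for i in range(len(src))
            if i >= len(rep) or src[i] != rep[i]]

def apply_delta(source: bytes, replica: bytes) -> bytes:
    """Bring the replica in sync by copying only the changed blocks."""
    out = bytearray(replica[:len(source)].ljust(len(source), b'\0'))
    for i in changed_blocks(source, replica):
        out[i * BLOCK:(i + 1) * BLOCK] = source[i * BLOCK:(i + 1) * BLOCK]
    return bytes(out)

src = b"the quick brown fox!"
rep = b"the quack brown fox!"   # one stale block on the replica
```

Here only block index 1 differs, so a sync ships 4 bytes instead of 20; at server scale the same ratio is what makes continuous replication affordable.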
Tags : 
carbonite, doubletake, risk mitigation
    
Carbonite
Published By: IBM     Published Date: Jul 26, 2017
To compete in today's fast-paced business climate, enterprises need accurate and frequent sales and customer reports to make real-time operational decisions about pricing, merchandising, and inventory management. They also require greater agility to respond to business events as they happen, and more visibility into business activities so information and systems are optimized for peak efficiency and performance. By making use of data capture and business intelligence to integrate and apply data across the enterprise, organizations can capitalize on emerging opportunities and build a competitive advantage. The IBM data replication portfolio is designed to address these issues through a highly flexible one-stop shop for high-volume, robust, secure information replication across heterogeneous data stores. The portfolio leverages real-time data replication to support high availability, database migration, application consolidation, dynamic warehousing, master data management (MDM), service
Tags : 
ibm, infosphere, data replication, security, data storage
    
IBM
Published By: IBM     Published Date: Mar 30, 2017
To meet the business imperative for enterprise integration and stay competitive, companies must manage the increasing variety, volume, and velocity of new data pouring into their systems from an ever-expanding number of sources. They need to bring all their corporate data together, deliver it to end users as quickly as possible to maximize its value, and integrate it at a more granular level than ever before, focusing on the individual transaction level rather than on general summary data. As data volumes continue to explode, clients must take advantage of a fully scalable information integration architecture that supports any type of data integration technique, such as extract, transform and load (ETL), data replication, or data virtualization.
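The transaction-level ETL integration described above can be sketched as a three-stage pipeline: extract raw records, transform each individual transaction (cleansing and normalizing as you go), and load the result into a target store. This is a generic illustration of the ETL pattern, not IBM's tooling; all record shapes and field names are invented for the example.

```python
# Raw feed: per-transaction records, not pre-aggregated summaries.
raw_transactions = [
    {"id": 1, "amount": "19.99", "currency": "usd"},
    {"id": 2, "amount": "5.00",  "currency": "EUR"},
    {"id": 3, "amount": "bad",   "currency": "usd"},  # rejected in transform
]

def extract(rows):
    """Pull records from the source (here, just an in-memory feed)."""
    yield from rows

def transform(rows):
    """Normalize each transaction; drop rows that fail parsing."""
    for row in rows:
        try:
            yield {"id": row["id"],
                   "amount": float(row["amount"]),
                   "currency": row["currency"].upper()}
        except ValueError:
            pass  # in practice, route unparseable rows to a reject queue

def load(rows, warehouse):
    """Write transformed transactions into the target store."""
    for row in rows:
        warehouse[row["id"]] = row

warehouse = {}
load(transform(extract(raw_transactions)), warehouse)
```

Because the stages are generators, each transaction streams through individually, which is what lets the same shape scale from a toy list to a high-volume feed.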
Tags : 
data integration, data security, data optimization, data virtualization, database security
    
IBM
Published By: IBM     Published Date: Nov 30, 2016
Covering both mobile and Internet of Things (IoT) use cases, this deep dive into offline-first explored several patterns for using PouchDB together with Cloudant, including setting up one database per user, one database per device, read-only replication, and write-only replication.
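The read-only and write-only replication patterns mentioned above amount to choosing a direction for one-way sync between a local (per-user or per-device) database and a remote one. A language-agnostic sketch of the idea, with plain dictionaries standing in for databases (PouchDB and Cloudant expose equivalent push/pull replication primitives; everything here is illustrative):

```python
def replicate(source: dict, target: dict) -> None:
    """One-way replication: copy over docs the target doesn't have yet."""
    for doc_id, doc in source.items():
        target.setdefault(doc_id, doc)

local_db = {"todo:1": {"text": "buy milk"}}       # on the device
remote_db = {"todo:2": {"text": "ship release"}}  # shared backend

# Write-only pattern: the device pushes its changes up.
replicate(local_db, remote_db)

# Read-only pattern: the device pulls shared state down.
replicate(remote_db, local_db)
```

Running only the push or only the pull gives the write-only or read-only pattern respectively; running both yields full two-way sync.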
Tags : 
ibm, cloud, offline-first, pouchdb, ibm cloudant, enterprise applications
    
IBM
Published By: IBM     Published Date: Sep 14, 2015
This analysis outlines the necessity of object storage in today's digitally-oriented market while highlighting Cleversafe as a solution.
Tags : 
ibm, cleversafe, object storage, storage, enterprise storage, raid, data migration, data replication, migration, network architecture, small business networks, backup and recovery, storage management, data deduplication, infrastructure
    
IBM
Published By: IBM     Published Date: Jul 08, 2015
To meet the business imperative for enterprise integration and stay competitive, companies must manage the increasing variety, volume, and velocity of new data pouring into their systems from an ever-expanding number of sources. They need to bring all their corporate data together, deliver it to end users as quickly as possible to maximize its value, and integrate it at a more granular level than ever before, focusing on the individual transaction level rather than on general summary data. As data volumes continue to explode, clients must take advantage of a fully scalable information integration architecture that supports any type of data integration technique, such as ETL, ELT (also known as ETL Pushdown), data replication, or data virtualization. Read this new whitepaper to learn about the seven essential elements needed to achieve the highest performance.
Tags : 
    
IBM
Published By: Basho     Published Date: Apr 07, 2015
This whitepaper looks at why companies choose Riak over a relational database. We focus specifically on availability, scalability, and the key/value data model. Then we analyze the decision points that should be considered when choosing a non-relational solution and review data modeling, querying, and consistency guarantees. Finally, we end with simple patterns for building common applications in Riak using its key/value design, dealing with data conflicts that emerge in an eventually consistent system, and multi-datacenter replication.
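The data-conflict problem called out above arises when an eventually consistent store returns multiple sibling values for the same key, and the application must merge them. Two common resolver strategies, sketched with illustrative (timestamp, value) pairs (the function names are ours, not Riak API calls):

```python
def last_write_wins(siblings):
    """Keep the value with the highest timestamp; discard the rest.

    Simple, but concurrent writes that lost the timestamp race vanish.
    """
    return max(siblings, key=lambda s: s[0])[1]

def merge_sets(siblings):
    """For set-valued data: union the siblings so no write is lost."""
    merged = set()
    for _, value in siblings:
        merged |= value
    return merged

# Two replicas accepted writes concurrently during a partition:
siblings = [(1700000005, {"a", "b"}), (1700000009, {"a", "c"})]
```

Last-write-wins here silently drops "b", while the set-union merge keeps every element; which trade-off is acceptable depends on the data model, which is why conflict resolution is an application-level decision in such systems.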
Tags : 
basho, riak, relational database, nosql, common applications, simple deployments, it management, data management
    
Basho
Published By: VMTurbo     Published Date: Mar 25, 2015
Intelligent N+X Redundancy, Placement Affinities, & Future Proofing in the Virtualized Data Center
Virtualization brought about the ability to simplify business continuity management in IT. Workload portability and data replication capabilities mean that physical infrastructure failures no longer need to impact application services, which can rapidly be recovered even in the event of complete site failure. However, enterprises and service providers face new challenges ensuring they have enough compute capacity in their virtualized data centers to support their business continuity requirements, while at the same time not over-provisioning infrastructure capacity, which results in unnecessary capital expenditure.
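The N+X trade-off described above (enough spare capacity to absorb X host failures, but no more) reduces to a simple conservative check: can the surviving hosts still carry the workload after the X largest hosts fail? A hedged sketch, with made-up capacity figures and no modeling of placement affinities:

```python
def tolerates_failures(host_capacities, workload_demand, x):
    """True if the cluster can still run the workload after losing
    its x largest hosts (a conservative N+X headroom check)."""
    surviving = sorted(host_capacities)[:len(host_capacities) - x]
    return sum(surviving) >= workload_demand

# Illustrative cluster: per-host capacity in GB of RAM.
hosts = [64, 64, 32, 32]
```

With a 120 GB workload this cluster is N+1 (it survives losing one 64 GB host) but not N+2, which is exactly the kind of headroom question that drives whether buying another host is prudent provisioning or wasted capital.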
Tags : 
continuity requirements, vmturbo, virtualization, workload portability, data replication, it management, knowledge management, enterprise applications
    
VMTurbo
Published By: Zerto     Published Date: Oct 22, 2014
A simpler approach to DR! Compare hypervisor-based replication to guest/OS-based and array-based replication. Read about Zerto's easy integration with VMware.
Tags : 
zerto, vmware, virtualization, data center, virtual replication, replication, business continuity, disaster recovery, storage virtualization, data replication, software compliance, data center design and management
    
Zerto
Published By: Riverbed     Published Date: Aug 19, 2014
EMC RecoverPoint family provides cost-effective, local continuous data protection and/or continuous remote replication solutions that allow for any-point-in-time data recovery. Riverbed SteelHead WAN Optimization solutions deliver maximum performance for business applications and data transfers over Wide-Area Networks (WANs).
Tags : 
data protection, performance, optimization, data transfer, wan, networking, it management
    
Riverbed
Published By: Riverbed     Published Date: Aug 19, 2014
Riverbed optimization understands the SRDF protocol header, allowing Riverbed Data Streamlining optimization to operate more effectively. Riverbed Selective Optimization for SRDF/A provides even greater performance gains by applying specific optimization policies to specific SRDF replication groups and avoiding learning unwanted byte patterns, increasing Riverbed Data Streamlining efficiency and overall application throughput.
Tags : 
srdf, data streamlining, optimization, byte, application throughput, networking
    
Riverbed