distributed model

Results 1 - 16 of 16
Published By: Akamai Technologies     Published Date: Jun 14, 2018
Traditional remote access technologies—like VPNs, proxies, and remote desktops—provide access in much the same way they did 20 years ago. However, new business realities—like an increasingly mobile and distributed workforce—are forcing enterprises to take a different approach to the complexity and security challenges that traditional access technologies present. Read 5 Reasons Enterprises Need a New Access Model to learn about the fundamental changes enterprises need to make when providing access to their private applications.
Tags : 
vpn, proxies, security, security breach, technology
    
Akamai Technologies
Published By: Datastax     Published Date: Aug 03, 2018
Part of the “new normal” where data and cloud applications are concerned is the ability to handle the multiple types of data models that exist in an application and persist each in a single datastore. This data management capability is called a “multi-model” database. Download this free white paper to explore the multi-model concept, its rationale, and how DataStax Enterprise (DSE) is the only database that can help accelerate building and powering distributed, responsive, and intelligent cloud applications across multiple data models.
Tags : 
    
Datastax
Published By: Ounce Labs, an IBM Company     Published Date: Dec 29, 2009
Countless studies and analyst recommendations suggest the value of improving security during the software development life cycle rather than trying to address vulnerabilities in software discovered after widespread adoption and deployment. The justification is clear. For software vendors, costs are incurred both directly and indirectly from security flaws found in their products. Reassigning development resources to create and distribute patches can often cost software vendors millions of dollars, while successful exploits of a single vulnerability have in some cases caused billions of dollars in losses to businesses worldwide. Vendors blamed for vulnerabilities in their products' source code face losses in credibility, brand image, and competitive advantage.
Tags : 
source code vulnerability testing, independent model, centralized model, distributed model, software development life cycle, source code scanning, application security, source code security testing, identity management, policy based management, security management, security policies, application integration, configuration management, data protection
    
Ounce Labs, an IBM Company
Published By: Cisco     Published Date: Jan 05, 2015
The Cisco UCS solution provides all management and configuration services at the centrally located Fabric Interconnects, so you can manage large-scale deployments from a single location. This method lets you consolidate hardware and streamline management. The IBM Flex System solution uses a distributed management model with chassis-level control. This method adds complexity to the hardware configuration, which can increase management needs.
Tags : 
datacenter, data management, collaborations, business capabilities
    
Cisco
Published By: Arbor     Published Date: Mar 13, 2014
All enterprises need to have mitigation solutions in place. Information security is vital in the workplace, and DDoS attacks have become more complex over time. Read this whitepaper to determine whether managed services are the best option for primary protection.
Tags : 
arbor, idc, distributed denial-of-service, ddos attack ddos solutions, dns reflection, exfiltration of data, security products, the cloud, cloud migration, high-bandwidth attacks, volumetric attacks, deployment, multi-vector threat, ips solutions, deployment models, security, infrastructure, business intelligence, business management, secure content management
    
Arbor
Published By: CA Technologies     Published Date: Jun 01, 2018
Challenge: Understanding, managing and containing risk has become a critical factor for many organizations as they plot their hybrid architecture strategy. Access by an expanding array of privileged identities looms large as a risk concern once organizations look beyond tactically using cloud services for cost and agility efficiencies. Existing approaches developed for static infrastructure can address initial risk concerns, but fall short in providing consistent policy enforcement and continuous visibility for dynamic, distributed infrastructure.
Opportunity: Multiple elements factor into how effectively an enterprise can embrace automation and advance the maturity of their transformation. However, security tools are central to enabling a structured and measured approach to managing critical access risks at each stage of the maturity model journey. With the right privileged access platform and set of tools, enterprises can progressively automate and scale access management to align risk
Tags : 
    
CA Technologies
Published By: Akamai Technologies     Published Date: Jun 27, 2017
Traditional remote access technologies were created twenty years ago, before businesses were distributed, mobile, and reliant on the cloud. View this slideshare to learn 3 reasons why now is the time for a new remote access model.
Tags : 
    
Akamai Technologies
Published By: Datastax     Published Date: Aug 23, 2017
Part of the “new normal” where data and cloud applications are concerned is the ability to handle multiple types of data models that exist in the application and persist each in a single datastore. This data management capability is called a “multi-model” database. Chances are you are getting bogged down by various data models that require support — key-value, tabular, JSON/document and graph. This not only raises your operational expenses, but also slows down your time to market and ultimately revenue growth. Download this free white paper and explore the multi-model concept, its rationale, and how DataStax Enterprise (DSE) is the only database that can help accelerate building and powering distributed, responsive and intelligent cloud applications across multiple data models.
Tags : 
cloud, data model, multi-model
    
Datastax
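The multi-model idea described above can be illustrated with a small sketch. This is plain Python, not the DataStax API; the entity, keys, and helper function are hypothetical, chosen only to show how the same data looks under three of the models a multi-model database must persist in one datastore.

```python
# Illustrative only: one "user" entity under three data models.
import json

# Key-value model: an opaque value looked up by key.
kv_store = {"user:42": json.dumps({"name": "Ada", "city": "London"})}

# Document model: nested JSON, queryable by field.
doc_store = [{"_id": 42, "name": "Ada", "address": {"city": "London"}}]

# Graph model: vertices plus labeled edges.
vertices = {42: {"label": "user", "name": "Ada"},
            7: {"label": "city", "name": "London"}}
edges = [(42, "lives_in", 7)]

def lives_in(user_id):
    """Traverse the graph: which cities does this user live in?"""
    return [vertices[dst]["name"] for src, rel, dst in edges
            if src == user_id and rel == "lives_in"]

print(lives_in(42))  # ['London']
```

A single-model database forces you to pick one of these shapes (or run one engine per shape); the pitch of a multi-model database is persisting all of them in one datastore.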
Published By: Dassault Systèmes     Published Date: May 09, 2018
Today’s thriving High-Tech sector is driven by shrinking product lifecycles, rapid innovation, distributed engineering/manufacturing—and highly demanding customer expectations. The industry needs to deliver on multiple fronts, including:
• Embed customer-centric innovation throughout the lifecycle: Only with customer experience at the core can companies stay ahead.
• Tame ideas into executable products: Detecting early trends and using customer feedback is vital.
• Manage complexity better: Increasing visibility of all product data helps build and manage digital models to use in every business function, from R&D to field service.
• Create relevant connected systems: High-Tech innovators use IoT for an ongoing dialogue among customers, devices and manufacturers.
• Provide agility to compete on software, hardware and service: Customers want value from every interaction.
Download your targeted industry analysis to learn more.
Tags : 
    
Dassault Systèmes
Published By: Vertica     Published Date: Oct 30, 2009
Independent research firm Knowledge Integrity Inc. examines two high-performance computing technologies that are transitioning into the mainstream: massively parallel analytical database management systems (ADBMS) and distributed parallel programming paradigms such as MapReduce (with implementations including Hadoop, Pig, and HDFS). By providing an overview of both concepts and looking at how the two approaches can be used together, the firm concludes that combining a high-performance batch programming and execution model with a high-performance analytical database provides significant business benefits for a number of different types of applications.
Tags : 
vertica, analytical computing, adbms, mapreduce, application management, data management, data mining, grid computing, business analytics, business metrics, linux, analytical applications, business intelligence, information management, data warehousing
    
Vertica
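The MapReduce paradigm mentioned above can be sketched in a few lines. This is a single-process simulation of the canonical word-count example, not Hadoop code: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group; in a real cluster these phases run distributed across many nodes.

```python
# Minimal single-process MapReduce simulation: word count.
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    # Emit a (word, 1) pair for every word in the document.
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    # Group all emitted values by key (done by the framework in Hadoop).
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Aggregate each key's values into a final count.
    return key, sum(values)

docs = ["big data big model", "data model"]
pairs = chain.from_iterable(map_phase(d) for d in docs)
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts)  # {'big': 2, 'data': 2, 'model': 2}
```

The appeal the paper describes is that this map/shuffle/reduce structure parallelizes naturally: each map and reduce call is independent, so the framework can scatter them across a cluster.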
Published By: Microsoft Azure     Published Date: Apr 11, 2018
When you extend the global reach of your enterprise, you’ll find new markets for your products and services. That means reaching more potential customers, bigger growth potential, and higher ROI. But to tap into those emerging markets, you need to provide the best, most consistent user experience. Now it’s possible to build, deploy, and manage modern apps at scale with a globally distributed database—without the hassles associated with hosting in your data center. Read the e-book Build Modern Apps with Big Data at a Global Scale and learn how Azure Cosmos DB, a globally distributed turnkey database service, is transforming the world of modern data management. Keep access to your data available, consistent, and safe—with industry-leading, enterprise-grade security and compliance. Start developing the best app experience for your users based on five well-defined consistency models, ranging from Strong (favoring data consistency; ideal for banks, e-commerce processing, and online booking) to Bounded Staleness, Session, Consistent Prefix, and Eventual.
Tags : 
    
Microsoft Azure
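The trade-off behind those consistency models can be shown with a toy simulation. This is not the Cosmos DB API; the class and method names are invented for illustration. Two replicas are modeled, with writes reaching the second only after replication: an "eventual" read may return stale data, while a "strong" read waits for replication to catch up first.

```python
# Toy simulation of strong vs. eventual consistency (not a real client API).

class ReplicatedStore:
    def __init__(self):
        self.primary = {}
        self.replica = {}
        self.pending = []  # writes not yet applied to the replica

    def write(self, key, value):
        self.primary[key] = value
        self.pending.append((key, value))

    def sync(self):
        # Replication catches up: apply all pending writes to the replica.
        for key, value in self.pending:
            self.replica[key] = value
        self.pending = []

    def read(self, key, consistency="eventual"):
        if consistency == "strong":
            self.sync()  # strong: wait for replication before serving
        return self.replica.get(key)  # eventual: may be stale

store = ReplicatedStore()
store.write("balance", 100)
print(store.read("balance"))                        # None (replica is stale)
print(store.read("balance", consistency="strong"))  # 100
```

Strong consistency trades latency for correctness (every read reflects the latest write), while eventual consistency serves reads immediately at the risk of staleness; the intermediate models sit between these extremes.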
Published By: Microsoft Azure     Published Date: Apr 11, 2018
Developing for and in the cloud has never been more dependent on data. Flexibility, performance, security—your applications need a database architecture that matches the innovation of your ideas. Industry analyst Ovum explored how Azure Cosmos DB is positioned to be the flagship database of internet-based products and services, and concluded that Azure Cosmos DB “is the first to open up [cloud] architecture to data that is not restricted by any specific schema, and it is among the most flexible when it comes to specifying consistency.” From security and fraud detection to consumer and industrial IoT, to personalized e-commerce and social and gaming networks, to smart utilities and advanced analytics, Azure Cosmos DB is how Microsoft is structuring the database for the age of cloud. Read the full report to learn how a globally distributed, multi-model data service can support your business objectives.
Tags : 
    
Microsoft Azure
Published By: Qlik     Published Date: Jun 24, 2016
This paper outlines the benefits of a distributed model, and includes insights and case studies from leading executives.
Tags : 
qlik, data, business data, business intelligence, distributed model, best practices, business activity monitoring, business analytics, business management, information management, records management
    
Qlik
Published By: Symantec     Published Date: Jul 11, 2017
The technology pendulum is always swinging. And chief information security officers must be prepared to swing with it—or get clocked. A look at recent history illustrates the oscillating nature of technology. In the 1980s, IBM mainframes dominated the landscape. In the ’90s, client-server computing came on the scene and data was distributed on personal computers. When the Web assumed predominance, the pendulum started to swing back to a centralized server. Then, just as quickly, mobile took the lead, with apps downloaded to workers’ devices—the new client server. Now, as mobile devices continue to populate the enterprise at a rapid rate, the IT model is changing again—to the provisioning of information on a just-what’s-needed, just-in-time basis from centralized servers consolidated in the cloud. The pendulum continues to swing and IT workloads are moving to the cloud en masse.
Tags : 
cloud, security, data protection, data loss, information security
    
Symantec
Published By: RagingWire Data Centers     Published Date: Oct 22, 2016
This paper presents RagingWire’s distributed redundancy model which is an enhancement of shared and dedicated infrastructure models.
Tags : 
ragingwire, data center, colocation, power, energy, storage, infrastructure, server hardware, blade servers, storage virtualization, power and cooling, colocation and web hosting
    
RagingWire Data Centers
Published By: Aomega     Published Date: Nov 06, 2006
Recent regulatory additions require that companies take proactive measures, like penetration testing, to enforce data privacy and integrity. By deploying a distributed model, companies can execute testing from different security levels, which is important for challenging their security posture based on level of access.
Tags : 
regulatory compliance, compliance, data privacy, pci, data privacy, data protection, access control, security testing, security audit, glba, hipaa compliance, aomega, security, network security, auditing, hacker detection, internet security, intrusion detection, intrusion prevention, security management
    
Aomega