data model

Published By: Pure Storage     Published Date: Feb 02, 2017
Read on to find out how purpose-built all-flash storage is an exceptional catalyst for improving data center operations and supporting the transition to the cloud operating model.
Tags : 
data centre, storage, flash storage, cloud, cloud computing
    
Pure Storage
Published By: HP     Published Date: Jul 29, 2008
This white paper describes an energy audit tool that can lead to significant reductions in the cost of running a data center. Thermal Zone Mapping (TZM) is a visualization tool developed by HP to present high-level thermal metrics in graphical form, targeted for use as part of the HP Data Center Thermal Assessment service. The tool takes data generated from computer models of the data center, computes metrics from that data, and then post-processes and visualizes those metrics in the three-dimensional data center space.
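The tool itself is not reproduced here, but the workflow the abstract describes (turning modeled temperatures into zone-level metrics over the three-dimensional data center space) can be sketched minimally. The grid shape, threshold, and hotspot metric below are illustrative assumptions, not HP's actual TZM calculations.

# Minimal sketch of a zone-level thermal metric over a 3D grid of modeled
# temperatures (degrees C). The metric and threshold are illustrative only.
import numpy as np

def zone_hotspot_fraction(temps, threshold_c=27.0):
    """Fraction of grid cells in a zone exceeding a temperature threshold."""
    temps = np.asarray(temps, dtype=float)
    return float((temps > threshold_c).mean())

# Hypothetical 4 x 4 x 3 zone (x, y, height) of modeled temperatures.
rng = np.random.default_rng(0)
zone = 22.0 + 8.0 * rng.random((4, 4, 3))
print(f"hotspot fraction: {zone_hotspot_fraction(zone):.2%}")

In the actual service, metrics like this would be computed per zone and rendered in 3D; the sketch only shows the aggregation step.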
Tags : 
heat, data center, heat monitoring, hardware, datacenter, mission critical, monitoring, network management, servers, green computing, power and cooling
    
HP
Published By: VeriSign Incorp.     Published Date: May 08, 2009
Web Application Threats Are Evolving. Are Your Security Efforts Keeping Pace? Today, Web application security threats are not only more abundant than ever, but also more difficult to detect and more complex to solve. Many organizations are responding to these unique vulnerabilities with traditional network security approaches. However, sophisticated Web application threats require a more sophisticated security strategy. What worked in the past won't necessarily work today; what's more, Web application security requires a comprehensive solution, not simply a series of a la carte provisions. For detailed steps toward improving your Web application security strategy, download the VeriSign® Enterprise Security Services white paper, Best Practices That Improve Web Application Security.
Tags : 
verisign, web application security, sensitive data, intellectual property, business processes, operational costs, verisign enterprise security services, point-of-sale, pos, application-layer vulnerabilities, web 2.0, virtual servers, service oriented architecture (soa), lightweight technologies, insider threat, holistic control model, software development lifecycle, sdlc, wafs, international computing
    
VeriSign Incorp.
Published By: Burton Group     Published Date: Jul 07, 2008
Data modeling has evolved from an arcane technique for database designers into an entire family of interrelated techniques that serves many constituencies, including techno-phobic business stakeholders and users. The new maturity of modeling tools and techniques arrives in the nick of time, because new technical and regulatory realities demand that enterprises maintain scrupulous awareness of their data and how it is used. Data modeling is no longer for databases only, no longer for technologists only, and no longer optional.
Tags : 
modeling, data modeling, data management, data mining, business intelligence, dba, database administration, burton group, business analytics
    
Burton Group
Published By: HP Data Center     Published Date: Feb 18, 2009
Today's data centers are pushing aside "old world" business, technology, and facility metrics in order to deliver unparalleled service capabilities, processes, and methodologies. The expectations created by today's high-density technology deployments are driving service delivery models to extremes, with very high service delivery capabilities adopted as baseline requirements under today's stringent business models. Part of the "revolution" driving data center modeling to unprecedented performance and efficiency levels is the fact that advances in processing performance and ever-smaller footprints have effectively countered each other.
Tags : 
hp data center, data center enfironment, high density computing, rack-mount servers, mep mechanical, electrical, and plumbing, virtualization, consolidation, it deployments, server consolidation, networking, storage, storage virtualization, server virtualization, data center design and management
    
HP Data Center
Published By: Citrix Systems     Published Date: Nov 10, 2014
Measurable performance is a key factor when selecting an Application Delivery Controller (ADC) solution for modern data centers. In this report, Tolly evaluates the performance of several Citrix NetScaler ADC models against comparable products. Learn how NetScaler fared, delivering up to 480% of the performance of comparable F5 products.
Tags : 
dos, denial of service, netscaler, defense, network, low-bandwidth, attacks, transactions, targeting, companies, prevention measures, firewall, intruder, security, data management
    
Citrix Systems
Published By: TruSignal     Published Date: Jun 03, 2013
This white paper aims to give B2C digital marketers a better understanding of why you may need an audience expansion technique and what questions to ask yourself before you get started. We hope not only to make the case for audience expansion techniques, but also to offer a guide that will help you choose the right data and the right techniques for reaching more of your desired prospects online. Specifically, this white paper discusses and differentiates two expansion approaches, lookalike and act-alike audiences, including how they are built, the problems they solve, and how to use them effectively throughout the marketing funnel.
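As a rough illustration of the lookalike idea the paper discusses (scoring prospects by their similarity to a seed audience of known converters), here is a minimal sketch using cosine similarity over profile features. The feature values, seed data, and scoring rule are hypothetical and are not TruSignal's method.

# Minimal lookalike-audience sketch: rank prospects by cosine similarity to the
# centroid of a seed (converter) audience. Features and data are hypothetical.
import numpy as np

def lookalike_scores(seed_features, prospect_features):
    seed_centroid = seed_features.mean(axis=0)
    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return np.array([cosine(p, seed_centroid) for p in prospect_features])

seed = np.array([[1.0, 0.8, 0.2], [0.9, 0.7, 0.1]])          # known converters
prospects = np.array([[0.95, 0.75, 0.15], [0.1, 0.2, 0.9]])  # candidates
print(lookalike_scores(seed, prospects))

Higher scores indicate prospects more similar to the seed audience and therefore better expansion candidates.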
Tags : 
audience expansion, lookalike, act-alike, audience targeting, predictive analytics, big data, profile data, behavioral data, third-party data, first-party data, digital marketing, single factor correlation, multiple factor correlation, predictive audience models, custom audiences, offline data, effective marketing practices, digital marketers, marketing funnel, trusignal
    
TruSignal
Published By: Compass Datacenters     Published Date: Aug 19, 2014
This white paper provides a real-world example of how Compass Datacenters used its modeling approach to design, commission, and calibrate its Shakopee, MN data center.
Tags : 
compass datacenters, calibrated data center, modeling approach, operational performance, it management, knowledge management, data center
    
Compass Datacenters
Published By: Cisco     Published Date: May 14, 2015
Cisco's Virtualized Multi-tenant Data Center (VMDC) system defines an end-to-end architecture, which an organization may reference for the migration or build out of virtualized, multi-tenant data centers for new cloud-based service models such as Infrastructure as a Service (IaaS).
Tags : 
cloud computing, erp, productivity, applications, efficiency, virtualization, cisco, data center, data warehousing, design and facilities
    
Cisco
Published By: Oracle     Published Date: Apr 16, 2018
Launched at Oracle Open World 2017, Oracle Autonomous Database Cloud uses a revolutionary machine learning model to enable automation that eliminates human error and manual tuning, resulting in high performance, storage availability, and security at a much lower cost. Learn how it works and why you should adopt it!
Tags : 
next, generation, database, data, leader, industry, arrives
    
Oracle
Published By: Oracle     Published Date: Apr 16, 2018
Oracle Autonomous Database Cloud, launched at Oracle Open World 2017, uses a revolutionary machine learning model that eliminates human error and manual tuning, enabling unprecedented performance, security, and availability at a much lower cost. Learn how it works and why you should adopt it!
Tags : 
here, new, generation, databases, data, leader, industry
    
Oracle
Published By: Oracle     Published Date: Apr 16, 2018
Database management is expensive and complicated. As the number of applications and databases grows, the costs and complications can multiply. One solution is a hardware and software system designed specifically for database software to optimize database operations, both for performance and for administrative simplicity. Oracle Exadata is the only platform that delivers optimal database performance and efficiency for mixed data, analytics, and OLTP workloads. With a full range of deployment options, it lets you run your Oracle database and data workloads where and how you want: on-premises, in the Oracle Cloud, on Cloud at Customer in your own data center, or in any combination of these models.
Tags : 
run, database, data, oracle, exadata
    
Oracle
Published By: Oracle     Published Date: Apr 16, 2018
Database management is expensive and complicated. As the number of applications and databases grows, costs and complications can multiply. One solution is a hardware and software system designed specifically so that the database software can optimize operations, simplifying both performance and administration. Oracle Exadata is the only platform that delivers optimal database performance and efficiency for mixed data, analytics, and online transaction processing (OLTP) workloads. With a wide variety of deployment options, you can run your Oracle databases and data workloads wherever and however you want: in the Oracle Cloud, on Cloud at Customer, in your own data center, or in any combination of these models.
Tags : 
run, database, data, oracle, exadata
    
Oracle
Published By: Datastax     Published Date: Apr 04, 2017
Graph databases are changing how we use data. But first, an example: you're (probably) a person working on a project and looking at graph databases as a potential solution, while we're a company with a graph database that hopefully solves your problem. Now, we could store that data in a boring relational database, but how do we do more than that? For instance, how do we use that data, combined with other data points, to find other people like you and recommend our solution to them? This is where a graph comes in handy. The friendly graph data model makes it easy to use patterns of relationships within large data sets, and by leveraging those relationships we can analyze data or create better real-time experiences. Why Graph explores why this graph database 'thing' is really a thing, how graph databases compare to other database systems, and the use cases they best support.
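For readers new to the graph data model, here is a minimal sketch of the "people like you" pattern described above, using a plain Python adjacency structure rather than DataStax's own query language; the vertices, edge labels, and traversal are illustrative assumptions.

# Minimal graph sketch: find users who share interests with a user who already
# adopted a product. Data model and traversal are illustrative only.
from collections import defaultdict

edges = [
    ("alice", "INTERESTED_IN", "graph_databases"),
    ("bob",   "INTERESTED_IN", "graph_databases"),
    ("alice", "EVALUATED",     "dse_graph"),
]

out = defaultdict(list)
incoming = defaultdict(list)
for src, label, dst in edges:
    out[src].append((label, dst))
    incoming[dst].append((label, src))

def similar_users(user):
    """Users connected to the same interests as `user`."""
    interests = {d for l, d in out[user] if l == "INTERESTED_IN"}
    return {s for i in interests for l, s in incoming[i]
            if l == "INTERESTED_IN" and s != user}

print(similar_users("alice"))  # {'bob'} -> candidates for a recommendation

A dedicated graph database expresses this traversal declaratively and keeps it fast at scale; the sketch only shows the relationship-following idea.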
Tags : 
graph, database, datastax
    
Datastax
Published By: Datastax     Published Date: Aug 03, 2018
Part of the “new normal” where data and cloud applications are concerned is the ability to handle multiple types of data models that exist in the application and persist each in a single datastore. This data management capability is called a “multi-model” database. Download this free white paper and explore the multi-model concept, its rationale, and how DataStax Enterprise (DSE) is the only database that can help accelerate building and powering distributed, responsive and intelligent cloud applications across multiple data models.
Tags : 
    
Datastax
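As a rough illustration of the multi-model concept described in the DataStax abstract above (the same logical record exposed through more than one data model in a single datastore), the sketch below shows one customer entity viewed as a key-value pair, a document, and graph edges. This is a conceptual illustration in plain Python, not DataStax Enterprise's actual API.

# Conceptual multi-model sketch: one entity viewed as key-value, document, and
# graph representations. Not DataStax Enterprise's actual API.
customer_id = "c-42"

# Key-value view: fast lookup by primary key.
kv_store = {customer_id: "Acme Corp"}

# Document view: nested, schema-flexible attributes.
doc_store = {customer_id: {"name": "Acme Corp",
                           "plan": "enterprise",
                           "contacts": [{"email": "ops@acme.example"}]}}

# Graph view: relationships to other entities.
graph_edges = [(customer_id, "USES", "dse-cluster-1")]

print(kv_store[customer_id], doc_store[customer_id]["plan"], graph_edges[0][1])

A multi-model database keeps these views consistent in one datastore instead of forcing the application to maintain three separate systems.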
Published By: FICO     Published Date: Mar 02, 2018
The role of analytics in managing, improving, and ultimately transforming supply chains cannot be overstated. But what about the analytics themselves? FICO's Zahir Balaporia and renowned author Tom Davenport use the term “The Analytics Supply Chain” to reflect that analytics themselves parallel supply chains, with inherent challenges and problems if things “get stuck.” Rethinking analytics in these terms can improve not only supply chain performance but also any other business problem you seek to solve. This article covers:
· The steps in the analytics supply chain and the vital role of data and analytic models;
· How your predictions, recommendations, and insights need to rely on attributes similar to those of finished manufactured products;
· Key questions to ask yourself in determining where you need to fix your analytics supply chain.
Tags : 
supply, chain, analytics, employee, optimization, organisations, productivity
    
FICO
Published By: Red Hat     Published Date: Mar 28, 2019
Technology has fundamentally changed the way we live. Access to data and information anytime, anywhere is no longer a luxury—it is a requirement, in both our personal and professional lives. For IT organizations, this means pressure has never been greater to deliver higher-quality applications more often, enabling companies to stay relevant and seize digital business opportunities. Cloud-native is an approach to building applications that takes advantage of cloud computing models and DevOps principles to make the delivery of new features and services faster and more flexible. With a cloud-native strategy, organizations can begin the culture, process, and technology changes needed to meet new demands and become an IT organization that can deliver business innovation faster.
Tags : 
    
Red Hat
Published By: Cisco and NVIDIA Corporation     Published Date: Feb 26, 2018
These are interesting and challenging times to be in business. Digital forces, such as the cloud, big data, mobility, and mobile apps are disrupting tried-and-true business models, as well as entire industries. These transformational advances are producing complexity, while workers are demanding simplicity. They require centralized infrastructures to keep information safe, even as agility and flexibility rule the day. Read this eBook to learn more about powerful forces of digital business transformation!
Tags : 
    
Cisco and NVIDIA Corporation
Published By: Anaplan     Published Date: Mar 29, 2018
Incentive compensation holds the potential to deliver optimal sales results. But with up to 60% of sales reps' income coming from incentive comp, it is crucial to get this right. Our study data has shown that ineffective compensation structures can lead to disengaged reps, high turnover, money left on the table, and low margins. The way we have designed and managed incentive compensation plans in the past may inhibit the sales force and prevent the business from scaling at the needed rate. Modeling and planning quickly become too complex for a spreadsheet-driven exercise.
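To make that complexity concrete, here is a minimal sketch of a tiered (accelerator-style) commission calculation, the kind of rule that quickly outgrows a spreadsheet once quotas, territories, and plan variants multiply. The tier boundaries and rates are invented for illustration and are not drawn from the Anaplan study.

# Minimal tiered-commission sketch; quota, tiers, and rates are invented.
def commission(bookings, quota):
    attainment = bookings / quota
    # (lower bound, upper bound, marginal rate) on quota attainment.
    tiers = [(0.0, 0.5, 0.04), (0.5, 1.0, 0.08), (1.0, float("inf"), 0.12)]
    payout = 0.0
    for low, high, rate in tiers:
        slice_attain = max(0.0, min(attainment, high) - low)
        payout += slice_attain * quota * rate
    return payout

print(commission(bookings=1_200_000, quota=1_000_000))  # 84000.0

Real plans layer quotas, splits, draws, and clawbacks on top of rules like this, which is why dedicated modeling tools displace spreadsheets.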
Tags : 
optimization, compensation, structures, anaplan, data
    
Anaplan
Published By: Lenovo and Intel     Published Date: Jul 10, 2018
The risk of cyber attacks and the cost of failure are increasing, and the penalties associated with failure are only going to grow. These challenges are compounded by increasing collaboration, the use of cloud-based solutions, and an increasingly mobile workforce. Join experts from Forrester, Lenovo, and Microsoft as they explore Forrester's newly updated Zero Trust Extended framework. Lenovo then covers how the four core components of its Data, Identity, Online and Device (DIODe) approach can minimize risk to data and critical IT. By taking these important steps toward a full Zero Trust Extended framework, you can:
· Protect the business from advanced threats
· Reduce the impact of breaches
· Easily support new business and operating models
· Rise to the challenge of evolving regulations like FISMA, HIPAA, PCI, and GDPR
Tags : 
    
Lenovo and Intel
Published By: F5 Networks Inc     Published Date: Jun 24, 2015
Enterprises are moving to a software-defined, private cloud data center model for agility, operational efficiency, and a self-service approach to deploying applications and associated services. They are using a two-tier hybrid services architecture to get the benefits of specialized hardware for front-door network services and scalable software for application- and stack-specific services. Read this whitepaper to learn how to integrate the necessary services with the orchestration and automation systems of a software-defined data center.
Tags : 
appplication services, cloud, software, load balancing, iot, agility, automation, operational efficiency, networking, software development, it management, knowledge management, enterprise applications
    
F5 Networks Inc
Published By: Logrhythm     Published Date: Feb 24, 2016
The time has come for CEOs and Boards to take personal responsibility for improving their companies’ cyber security. Global payment systems, private customer data, critical control systems, and core intellectual property are all at risk today. As cyber criminals step up their game, government regulators get more involved, litigators and courts wade in deeper, and the public learns more about cyber risks, corporate leaders will have to step up accordingly. This whitepaper focuses on the LogRhythm Security Intelligence Maturity Model, and how it is a valuable guide for building the necessary successive layers of threat detection and response capabilities. Download this paper now to find out more.
Tags : 
cyber attack, risk, cyber security, cyber criminals, cyber risks, security intelligence, threat detection, access control, anti spam, anti spyware, anti virus, application security, authentication, compliance, identity management, intrusion detection, intrusion prevention
    
Logrhythm
Published By: McAfee EMEA     Published Date: Nov 15, 2017
Private cloud is one of the critical deployment architectures IT teams are adopting as they transition to a service-centric delivery model. More than 75% of organizations already use private clouds to lower costs, increase agility, and exert greater control over security, data protection, and compliance. The transition to private cloud represents a paradigm shift in how IT is provisioned and data centers are deployed. Virtualization is expanding beyond servers into storage and networking, while software-defined models allow new levels of agility through advanced automation and orchestration. https://www.eventbrite.com/e/mpower-emea-cybersecurity-summit-2017-tickets-36893006977?aff=kp4
Tags : 
cloud, security, guide, privacy, architectures, organizations, compliance, protection
    
McAfee EMEA
Published By: McAfee EMEA     Published Date: Nov 15, 2017
Machine learning offers the depth, creative problem-solving capabilities, and automation to help security organizations gain significant ground against attackers. It’s a powerful tool for processing massive amounts of data for the purpose of malware classification and analysis, especially for unknown threats. Through supervised learning, human researchers can continually develop new training models that expand the understanding and competency of machine learning systems.
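As a minimal sketch of the supervised-learning workflow described above, the example below trains a classifier on labeled feature vectors and scores an unknown sample. The features, labels, and model choice are illustrative assumptions, not McAfee's actual pipeline.

# Minimal supervised malware-classification sketch; features, labels, and the
# model choice are illustrative, not McAfee's actual pipeline.
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-sample features: [file size (KB), imported API count,
# entropy, suspicious string count]; label 1 = malicious, 0 = benign.
X_train = [[120, 35, 6.1, 4], [800, 210, 7.8, 19],
           [95, 22, 5.2, 0], [640, 180, 7.5, 11]]
y_train = [0, 1, 0, 1]

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

unknown_sample = [[300, 150, 7.2, 8]]
print("P(malicious) =", model.predict_proba(unknown_sample)[0][1])

In practice, researchers retrain models like this continually on much larger labeled corpora, which is the "new training models" loop the abstract refers to.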
Tags : 
analytics, security, problem solving, creative, data, researching, malware
    
McAfee EMEA