Glossary Items

D

  1. DALI is a standardized interface for lighting control. Electronic ballasts for fluorescent lamps, transformers, and sensors of lighting systems communicate with the building automation and control system via DALI.
  2. DASH7 is an “instant-on,” long-range, low power wireless communications standard for applications requiring modest bandwidth like text messages, sensor readings, or location-based advertising coordinates.
    DASH7 is an open source RFID-standard for wireless sensor networking, which operates in the 433 MHz unlicensed ISM band/SRD band. DASH7 provides multi-year battery life, range of up to 2 km, indoor location with 1 meter accuracy, low latency for connecting with moving things, a very small open source protocol stack, AES 128-bit shared key encryption support, and data transfer of up to 200 kbit/s.
  3. Data aggregation is any process in which information is gathered and expressed in a summary form for purposes such as statistical analysis. A common aggregation purpose is to get more information about particular groups based on specific variables such as age, profession, or income.
    A common aggregation purpose is to get more information about particular groups based on specific variables such as age, profession, or income. For example, a site that sells music CDs might advertise certain CDs based on the age of the user and the data aggregate for their age group. Online analytic processing (OLAP) is a simple type of data aggregation in which the marketer uses an online reporting mechanism to process the information.
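    As a rough illustration, the sketch below gathers hypothetical purchase records and expresses them in summary form by age band, using only the Python standard library; the field names and values are invented for this example.

    from collections import defaultdict
    from statistics import mean

    # Hypothetical individual records; fields and values are illustrative only.
    records = [
        {"age": 23, "profession": "student", "spend": 12.5},
        {"age": 27, "profession": "engineer", "spend": 48.0},
        {"age": 34, "profession": "teacher", "spend": 30.0},
        {"age": 36, "profession": "engineer", "spend": 55.0},
    ]

    # Group the detailed records into age bands, then summarize each band.
    groups = defaultdict(list)
    for r in records:
        band = f"{(r['age'] // 10) * 10}s"   # e.g. 23 -> "20s"
        groups[band].append(r["spend"])

    summary = {band: {"count": len(vals), "avg_spend": round(mean(vals), 2)}
               for band, vals in groups.items()}
    print(summary)   # {'20s': {'count': 2, 'avg_spend': 30.25}, '30s': {'count': 2, 'avg_spend': 42.5}}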
  4. Data at rest is an information technology term referring to inactive data that is stored physically in any digital form (e.g., databases, data warehouses, spreadsheets, archives, tapes, off-site backups, mobile devices, etc.).
    Data at rest is a term that is sometimes used to refer to all data in computer storage while excluding data that is traversing a network or temporarily residing in computer memory to be read or updated. Businesses, government agencies, and other institutions are concerned about the ever-present threat posed by hackers to data at rest. In order to keep data at rest from being accessed, stolen, or altered by unauthorized people, security measures such as data encryption and hierarchical password protection are commonly used. For some types of data, specific security measures are mandated by law.
  5. A data broker, also called an information broker or information reseller, is a business that collects personal information about consumers and sells that information to other organizations.
    A data broker, also called an information broker or information reseller, is a business that collects personal information about consumers and sells that information to other organizations. Data brokers can collect information about consumers from a variety of public and non-public sources including courthouse records, website cookies and loyalty card programs. Typically, brokers create profiles of individuals for marketing purposes and sell them to businesses who want to target their advertisements and special offers.
  6. Data classification is the process of sorting and categorizing data into various types, forms, or any other distinct class. Data classification enables the separation and classification of data according to data set requirements for various business or personal objectives.
    Data classification is the process of sorting and categorizing data into various types, forms or any other distinct class. Data classification enables the separation and classification of data according to data set requirements for various business or personal objectives. It is mainly a data management process. Data classification is a diverse process that involves various methods and criteria for sorting data within a database or repository. This is generally done through a database or business intelligence software that provides the ability to scan, identify and separate data.
  7. The Data Distribution Service (DDS) for real-time systems is an Object Management Group (OMG) machine-to-machine standard that aims to enable dependable, high-performance, interoperable, real-time, scalable data exchanges using a publish-subscribe pattern.
    The Data Distribution Service for Real-Time Systems (DDS) is an Object Management Group (OMG) machine-to-machine (M2M) middleware standard that aims to enable scalable, real-time, dependable, high-performance and interoperable data exchanges between publishers and subscribers. DDS addresses the needs of applications like financial trading, air-traffic control, smart grid management, and other big data applications. The standard is used in applications such as smartphone operating systems, transportation systems and vehicles, software-defined radio, and by healthcare providers. DDS may also be used in certain implementations of the Internet of Things.
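    DDS itself is a full middleware standard, but the underlying publish-subscribe pattern it uses can be sketched in a few lines: publishers write samples to a named topic and subscribers register interest in that topic, without either side referencing the other directly. The class, topic name and sample below are invented for illustration and do not use any real DDS API.

    from collections import defaultdict

    class TopicBus:
        """Minimal in-process publish-subscribe bus (illustrative only, not the DDS API)."""
        def __init__(self):
            self._subscribers = defaultdict(list)   # topic name -> list of handlers

        def subscribe(self, topic, handler):
            self._subscribers[topic].append(handler)

        def publish(self, topic, sample):
            # Every subscriber of the topic receives the sample; publisher and
            # subscribers are decoupled and never reference each other directly.
            for handler in self._subscribers[topic]:
                handler(sample)

    bus = TopicBus()
    bus.subscribe("grid/voltage", lambda sample: print("controller received", sample))
    bus.publish("grid/voltage", {"node": 7, "volts": 229.8})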
  8. Data Encryption is a process of encoding a message so that it can be read only by the sender and the intended recipient. Encryption is the most effective way to achieve data security. To read an encrypted file, you must have access to a secret key or password that enables you to decrypt it.
    Encryption is the most effective way to achieve data security. To read an encrypted file, you must have access to a secret key or password that enables you to decrypt it. Unencrypted data is called plain text; encrypted data is referred to as cipher text. There are two main types of encryption: asymmetric encryption (also called public-key encryption) and symmetric encryption. The primary purpose of encryption is to protect the confidentiality of digital data stored on computer systems or transmitted via the Internet or other computer networks. Modern encryption algorithms play a vital role in the security assurance of IT systems and communications as they can provide not only confidentiality, but also the following key elements of security: Authentication - the origin of a message can be verified. Integrity - proof that the contents of a message have not been changed since it was sent. Non-repudiation - the sender of a message cannot deny sending the message.
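    As a minimal sketch of symmetric encryption, the example below assumes the third-party Python package cryptography is installed (pip install cryptography); the same secret key encrypts and decrypts, which is exactly what asymmetric (public-key) encryption avoids by using a key pair instead. The message text is invented.

    from cryptography.fernet import Fernet   # third-party "cryptography" package

    key = Fernet.generate_key()              # the secret key both parties must share
    cipher = Fernet(key)

    plaintext = b"meter reading: 42 kWh"     # invented example message
    ciphertext = cipher.encrypt(plaintext)   # unreadable without the key
    recovered = cipher.decrypt(ciphertext)

    assert recovered == plaintext
    print(ciphertext)                        # the cipher text differs on every run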
  9. Data governance refers to the overall management of the availability, usability, integrity, and security of the data employed in an enterprise. A sound data governance program includes a governing body or council, a defined set of procedures, and a plan to execute those procedures.
  10. Data integrity, in terms of data and network security, is the assurance that information can only be accessed or modified by those authorized to do so.
    Measures taken to ensure integrity include controlling the physical environment of networked terminals and servers, restricting access to data, and maintaining rigorous authentication practices. Data integrity can also be threatened by environmental hazards, such as heat, dust, and electrical surges. Practices followed to protect data integrity in the physical environment include: making servers accessible only to network administrators, keeping transmission media (such as cables and connectors) covered and protected to ensure that they cannot be tapped, and protecting hardware and storage media from power surges, electrostatic discharges, and magnetism. Network administration measures to ensure data integrity include: maintaining current authorization levels for all users, documenting system administration procedures, parameters, and maintenance activities, and creating disaster recovery plans for occurrences such as power outages, server failure, and virus attacks.
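    One common technical control for detecting unauthorized or accidental modification is to record a cryptographic checksum when data is written and compare it before the data is trusted; the sketch below uses only the Python standard library, and the sample data is invented.

    import hashlib

    def fingerprint(data: bytes) -> str:
        """Return a SHA-256 digest used as an integrity fingerprint."""
        return hashlib.sha256(data).hexdigest()

    original = b"patient_id,reading\n1001,98.6\n"    # invented sample data
    stored_digest = fingerprint(original)            # recorded when the data is written

    # Later, recompute the digest and compare before trusting the data.
    tampered = original.replace(b"98.6", b"99.9")
    print(fingerprint(original) == stored_digest)    # True  - integrity intact
    print(fingerprint(tampered) == stored_digest)    # False - modification detected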
  11. Data Janitor is a subtask of data science concerned with the cleaning up of dirty or duplicative data. A data janitor sifts through data for companies in the information technology industry.
    A less than glamorous subtask of data science concerned with the cleaning up of dirty or duplicative data.
  12. Coined by Pentaho CTO James Dixon, a data lake is a massive data repository, designed to hold raw data until it’s needed and to retain data attributes so as not to preclude any future uses or analysis.
    The data lake is stored on relatively inexpensive hardware, and Hadoop can be used to manage the data, replacing OLAP as a means to answer specific questions. Sometimes referred to as an “enterprise data hub,” the data lake and its retention of native formats sits in contrast to the traditional data warehouse concept.
  13. Generally, data mining (sometimes called data or knowledge discovery) is the process of analyzing data from different perspectives and summarizing it into useful information - information that can be used to increase revenue, cut costs, or both.
    For example, one Midwest grocery chain used the data mining capacity of Oracle software to analyze local buying patterns. They discovered that when men bought diapers on Thursdays and Saturdays, they also tended to buy beer. Further analysis showed that these shoppers typically did their weekly grocery shopping on Saturdays. On Thursdays, however, they only bought a few items.
  14. Data normalization is the process of reducing data to its canonical form. Database normalization is the process of organizing the fields and tables of a relational database to minimize redundancy and dependency.
    Database normalization is the process of organizing the fields and tables of a relational database to minimize redundancy and dependency. In creating a database, normalization is the process of organizing it into tables in such a way that the results of using the database are always unambiguous and as intended. Normalization may have the effect of duplicating data within the database and often results in the creation of additional tables. (While normalization tends to increase the duplication of data, it does not introduce redundancy, which is unnecessary duplication.) Normalization is typically a refinement process after the initial exercise of identifying the data objects that should be in the database, identifying their relationships, and defining the tables required and the columns within each table.
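    The sketch below, using Python's built-in sqlite3 module, shows one normalization step: instead of repeating customer details in every order row, the customer lives in its own table and orders reference it by key; the table and column names are invented for illustration.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Normalized layout: customer details are stored exactly once and
    # orders reference them by key instead of repeating them.
    cur.executescript("""
        CREATE TABLE customers (
            customer_id INTEGER PRIMARY KEY,
            name        TEXT NOT NULL,
            postal_code TEXT NOT NULL
        );
        CREATE TABLE orders (
            order_id    INTEGER PRIMARY KEY,
            customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
            item        TEXT NOT NULL
        );
    """)

    cur.execute("INSERT INTO customers VALUES (1, 'Ada', '10115')")
    cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                    [(100, 1, "sensor"), (101, 1, "gateway")])

    # Joining the tables reproduces the flat, denormalized view unambiguously.
    for row in cur.execute("""SELECT o.order_id, c.name, c.postal_code, o.item
                              FROM orders o JOIN customers c USING (customer_id)"""):
        print(row)
    conn.close()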
  15. The right of individuals to control or influence what information related to them may be collected and stored, and by whom and to whom that information may be disclosed.
    The right not to be subjected to unsanctioned invasion of privacy by the government, corporations or individuals is part of many countries' privacy laws, and in some cases, constitutions. Almost all countries have laws which in some way limit privacy. An example of this would be laws concerning taxation, which normally require the sharing of information about personal income or earnings. In some countries, individual privacy may conflict with freedom of speech laws and some laws may require public disclosure of information which would be considered private in other countries and cultures. Privacy may be voluntarily sacrificed, normally in exchange for perceived benefits and very often with specific dangers and losses, although this is a very strategic view of human relationships. Research shows that people are more willing to voluntarily sacrifice privacy if the data gatherer is seen to be transparent as to what information is gathered and how it is used. In the business world, a person may volunteer personal details (often for advertising purposes) in order to gamble on winning a prize. A person may also disclose personal information as part of being an executive for a publicly traded company in the USA pursuant to federal securities law. Personal information which is voluntarily shared but subsequently stolen or misused can lead to identity theft. The concept of universal individual privacy is a modern construct primarily associated with Western culture, British and North American in particular, and remained virtually unknown in some cultures until recent times. According to some researchers, this concept sets Anglo-American culture apart even from Western European cultures such as French or Italian. Most cultures, however, recognize the ability of individuals to withhold certain parts of their personal information from wider society - a figleaf over the genitals being an ancient example.
  16. Data science utilizes data preparation, statistics, predictive modeling, and machine learning to investigate problems in various domains such as agriculture, marketing optimization, fraud detection, risk management, marketing analytics, public policy, etc.
    Data science utilizes data preparation, statistics, predictive modeling and machine learning to investigate problems in various domains such as agriculture, marketing optimization, fraud detection, risk management, marketing analytics, public policy, etc. It emphasizes the use of general methods such as machine learning that apply without changes to multiple domains. This approach differs from traditional statistics with its emphasis on domain-specific knowledge and solutions.
  17. In connection-oriented communication, a data stream is a sequence of digitally encoded coherent signals (packets of data or data packets) used to transmit or receive information that is in the process of being transmitted.
    A stream of digital information stemming from a physical or virtual sensor. Data streaming is the process of transferring a stream of data from one place to another, from a sender to a recipient, or through some network path. Data streaming is applied in multiple ways with various protocols and tools that help provide security, efficient delivery and other data results. Data streaming methods are central to technologies like the Internet, 3G and 4G wireless systems for mobile devices, as well as data handling for business processes in corporate networks. Administrators typically employ precise methods and processes to monitor data streaming and ensure effectiveness and maximum security.
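    A rough sketch of the idea in Python: a generator yields small, timestamped data packets one at a time and the consumer processes each packet as it arrives instead of waiting for a complete data set; the sensor name and value range are invented.

    import random
    import time

    def temperature_stream(sensor_id, count=5):
        """Yield a sequence of small, timestamped data packets (an invented sensor)."""
        for _ in range(count):
            yield {"sensor": sensor_id,
                   "ts": time.time(),
                   "value": round(20 + random.random() * 5, 2)}

    # The consumer handles each packet as it arrives - the essence of streaming.
    for packet in temperature_stream("greenhouse-1"):
        print(packet)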
  18. Data ecosystems are complex and littered with data silos, limiting the value that organizations can get out of their own data by making it difficult to access.
    To unlock the value of raw data, companies must start treating data more as a supply chain, enabling the data to flow easily and usefully through the entire organization―and eventually throughout the organization’s ecosystem of partners as well.
  19. The speed with which data is generated and processed. The challenge is matching the speed of decision to the speed of action: business leaders have been bombarded with statistics about the soaring volume of data that they can mine for valuable insights.
  20. Data-Driven Decision Management (DDDM) is an approach to business governance valuing decisions that can be backed up with verifiable data. The data-driven approach is gaining popularity within the enterprise as the amount of available data increases in tandem with market pressures.
    The data-driven approach is gaining popularity within the enterprise as the amount of available data increases in tandem with market pressures. Data-driven decision management is usually undertaken as a means of gaining a competitive advantage. A study from the MIT Center for Digital Business found that organizations driven most by data-based decision making had 4% higher productivity rates and 6% higher profits. The success of the data-driven approach is reliant upon the quality of the data gathered and the effectiveness of its analysis and interpretation.
  21. In information technology, the Datagram Transport Layer Security (DTLS) communications protocol provides communications security for datagram protocols. DTLS allows datagram-based applications to communicate in a way that is designed to prevent eavesdropping, tampering, or message forgery.
    DTLS allows datagram-based applications to communicate in a way that is designed to prevent eavesdropping, tampering, or message forgery. The DTLS protocol is based on the stream-oriented Transport Layer Security (TLS) protocol and is intended to provide similar security guarantees. DTLS preserves the semantics of the underlying datagram transport: the application does not suffer from the delays associated with stream protocols, but it has to deal with packet reordering, datagram loss, and data larger than the size of a datagram network packet.
  22. A term coined by Marc Blackmer, datakinesis occurs when an action taken in cyberspace has a result in the physical world. Industrial Control Systems, for example, are vulnerable to datakinetic attacks where physical equipment such as valves and sensors are compromised and damaged by hackers.
    Industrial Control Systems, for example, are vulnerable to datakinetic attacks where physical equipment such as valves and sensors are compromised and damaged by hackers. Stuxnet is one such example.
  23. DATEX II is a standard for information exchange between traffic management centres, developed in line with the ITS (Intelligent Transport Systems) Action Plan.
    DATEX II has been developed to provide a standardised way of communicating and exchanging traffic information between traffic centres, service providers, traffic operators and media partners. The specification provides for a harmonised way of exchanging data across boundaries, at a system level, to enable better management of the European road network.
  24. De-identification is the process used to prevent someone's personal identity from being revealed. The process must include the removal of both direct identifiers (name, email address, etc.) and the proper handling of quasi-identifiers (sex, marital status, profession, postal code, etc.).
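    A minimal sketch of the two steps in Python: direct identifiers are dropped outright, while quasi-identifiers are generalized (age into a band, postal code truncated to a prefix); the field names, record and generalization rules are invented for illustration.

    def deidentify(record):
        """Drop direct identifiers and coarsen quasi-identifiers (illustrative rules)."""
        decade = (record["age"] // 10) * 10
        return {
            # direct identifiers such as name and email are removed entirely
            "age_band": f"{decade}-{decade + 9}",
            "postal_prefix": record["postal_code"][:3],   # keep only a region prefix
            "profession": record["profession"],
        }

    raw = {"name": "Jane Doe", "email": "jane@example.com",
           "age": 34, "postal_code": "94107", "profession": "nurse"}
    print(deidentify(raw))
    # {'age_band': '30-39', 'postal_prefix': '941', 'profession': 'nurse'}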
  25. A decentralized application operates autonomously and stores its data on a Blockchain. The back-end code for a Dapp runs on a peer-to-peer network rather than on a centralized server.
    The back-end code for a Dapp runs on a peer-to-peer network rather than on a centralized server. Dapps are incentivized by the use of cryptographic tokens to create an independent ecosystem.
  26. Distributed Artificial Intelligence is a subfield of artificial intelligence research dedicated to the development of distributed solutions for problems. It is closely related to and a predecessor of the field of multi-agent systems.
    Decentrality makes for greater flexibility and quicker decisions. Intelligence evolves in the swarm or through joint networking with the cloud.
  27. A Decentralized Autonomous Organization (DAO) is an organization that runs without human intervention. Governance is strictly restricted to pre-coded rules.
    This type of organization is run through rules encoded in smart contracts. A DAO's transaction record and program rules are maintained on a Blockchain, which presents clear advantages in terms of transparency and security. The legal framework, however, remains unclear.
  28. Dedicated Short-Range Communication (DSRC) is a wireless communication technology designed to allow automobiles in the intelligent transportation system (ITS) to communicate with other automobiles or infrastructure technology.
    A wireless technology for vehicular traffic. Using a modified version of 802.11, a technology for North American cars and trucks, DSRC is designed for several applications. For example, ambulances can cause traffic lights down the road to change in their favor, and traffic congestion can be transmitted to automobile navigation systems. It allows vehicles to sense that they are about to crash, so the safety systems can begin to tighten seatbelts and warm up the airbags before impact. In addition, a standard for wireless payment allows parking lots and fast-food drive-ins to offer the same convenience as automated highway toll systems such as E-ZPass. As the most developed communication technology for V2X, Dedicated Short-Range Communication (DSRC) came into being in the 1990s. In the past 20 years, various governments, standards institutions and OEMs have invested heavily in DSRC technology development and application. To date, the US, European Union and Japan have all developed separate DSRC standards.
  29. An engineering concept used in MEMS that describes the directions in which an object can move and generally the number of independent variables in a dynamic system.
    In mechanics, the degree of freedom of a mechanical system is the number of independent parameters that define its configuration. It is the number of parameters that determine the state of a physical system and is important to the analysis of systems of bodies in mechanical engineering, aeronautical engineering, robotics, and structural engineering.
  30. Delegated Proof of Stake (DPoS) is a consensus algorithm developed to secure a blockchain by ensuring representation of transactions within it. It is an implementation of technology-based democracy, using voting and election process to protect the blockchain from centralization and malicious usage.
    Holders of the cryptocurrency of that particular Blockchain have the chance to vote for a delegate they trust with validating the transactions on that Blockchain. The voting process is continuous; therefore, delegates are incentivized to keep a high standard of work at all times and not to cheat.
  31. A denial of service attack is an incident in which a user or organization is deprived of the services of a resource they would normally expect to have. In a distributed denial-of-service, large numbers of compromised systems (sometimes called a botnet) attack a single target.
    A denial of service attack is an incident in which a user or organization is deprived of the services of a resource they would normally expect to have. In a distributed denial-of-service, large numbers of compromised systems (sometimes called a botnet) attack a single target. The most common kind of DoS attack is simply to send more traffic to a network address than the programmers who planned its data buffers anticipated someone might send. The attacker may be aware that the target system has a weakness that can be exploited or the attacker may simply try the attack in case it might work.
  32. A device attack is an exploit that takes advantage of a vulnerable device to gain access to a network. The term "device attack" was coined to differentiate such exploits from those targeting personal computers.
    A device attack is an exploit that takes advantage of a vulnerable device to gain access to a network. The term "device attack" was coined to differentiate such exploits from those targeting personal computers. The attack vector could be any other kind of Internet-connected device. Potential targets include not just smartphones, which are the most commonly cited example, but also network hardware, smart grid components, medical equipment and embedded systems among a great many other possibilities. Securing non-PC devices is problematic for a number of reasons. For one thing, many security measures, such as virus scanning, that are suitable for a PC, place too great a demand on the limited resources of smaller devices for memory, processor cycles and electrical power. Administration of patches and updates can be difficult because of sporadic connectivity to the corporate network.
  33. An IoT device management platform. Device Cloud Networks deliver a platform for M2M enablement, focusing on addressing the needs that have historically challenged the growth, penetration, and business opportunities by giving the ability to connect devices simply and easily.
    DCN delivers a platform for M2M enablement focusing on addressing the needs that have historically challenged the growth, penetration, and business opportunities by giving the ability to connect devices simply and easily.
  34. A device driver is a program that controls a particular type of device that is attached to your computer. When buying an operating system, many device drivers are built into the product.
    There are device drivers for printers, displays, CD-ROM readers, diskette drives, and so on. When buying an operating system, many device drivers are built into the product.
  35. The term provisioning for a device means to evolve a device to a state in which it can be handed off to an end-user, or end-user team, for their specific use in a functional manner.
    In general, provisioning means providing or making something available. The term is used in a variety of contexts in IT. For example, in grid computing, to provision is to activate a grid component, such as a server, array, or switch, so that it is available for use. In a storage area network (SAN), storage provisioning is the process of assigning storage to optimize performance. In telecommunications terminology, provisioning means providing a product or service, such as wiring or bandwidth.
  36. deviceJS is an open source JavaScript library for IoT devices. deviceJS is like jQuery for IoT. It lets you build node.js/io.js web applications while selecting, controlling and listening to a large variety of smart devices.
    deviceJS is like jQuery for IoT. It lets you build node.js/io.js web applications while selecting, controlling and listening to a large variety of smart devices. deviceJS can autodetect many devices, and provides hooks for event changes and device discovery. It has protocol and schema support for many IoT frameworks. Scripts/applications written for DeviceJS can execute and consequently control devices in many locations.
  37. DevOps is the combination of cultural philosophies, practices, and tools that automates the processes used to deliver applications between the IT operations and software development teams.
  38. This format is used to store computer data on audiotape. Digital Data Storage is a format for storing and backing up computer data on tape that evolved from the Digital Audio Tape (DAT) technology.
    Digital Data Storage (DDS) is a format for storing and backing up computer data on tape that evolved from the Digital Audio Tape (DAT) technology. DAT was created for CD-quality audio recording. In 1989, Sony and Hewlett Packard defined the DDS format for data storage using DAT tape cartridges. Tapes conforming to the DDS format can be played by either DAT or DDS tape drives. However, DDS tape drives cannot play DAT tapes since they can't pick up the audio on the DAT tape. DDS uses a 4-mm tape. A DDS tape drive uses helical scanning for recording, the same process used by a video recorder (VCR). There are two read heads and two write heads. The read heads verify the data that has been written (recorded).
  39. Digital disruption is the change that occurs when new digital technologies and business models affect the value proposition of existing goods and services.
    The rapid increase in the use of mobile devices for personal use and work, a shift sometimes referred to as the consumerization of IT, has increased the potential for digital disruption across many industries. A powerful example is the way Amazon, Netflix and Hulu Plus have disrupted the media and entertainment industries by changing how content is accessed by customers and monetized by advertisers. For example, the CBS, NBC and ABC networks in the United States still receive income from broadcasting television shows, but they can't charge as much for advertising as they could when there were only three networks and all viewers used television sets to consume content. Since 2000, 52 percent of the companies in the Fortune 500 have gone bankrupt, have been acquired, or have ceased to exist, due in large part to the disruption of traditional industry models by digital models.
  40. Digital Enhanced Cordless Telecommunications (DECT) is used primarily in the home and small office systems but is also available in many private branch exchange (PBX) systems for medium and large businesses.
    DECT is used primarily in home and small office systems, but is also available in many private branch exchange (PBX) systems for medium and large businesses. DECT can also be used for purposes other than cordless phones. Voice applications, such as baby monitors, are becoming common. Data applications also exist, but have been eclipsed by Wi-Fi. 3G and 4G cellular also competes with both DECT and Wi-Fi for both voice and data. DECT is also used in special applications, such as remote controls for industrial applications.
  41. A digital footprint is a trail of data you create while using the Internet. It includes the websites you visit, emails you send, and the information you submit to online services.
    Tony Fish expounded upon the possible dangers of digital footprints in a 2007 self-published book. In his model, a closed loop takes data from the open loop and provides it as a new data input. This new data determines what the user has reacted to, or how they have been influenced. The feedback then builds a digital footprint based on social data, and the controller of the social digital footprint data can determine how and why people purchase and behave.
  42. Digital manufacturing focuses on reducing the time and cost of manufacturing by integrating and using data from design, production, and product use; digitizing manufacturing operations to improve the product, process, and enterprise performance, and tools for modeling and advanced analytics.
    Digital manufacturing focuses on reducing the time and cost of manufacturing by integrating and using data from design, production, and product use; digitizing manufacturing operations to improve product, process, and enterprise performance, and tools for modeling and advanced analytics, throughout the product life cycle.
  43. The digital signal controller (DSC) is a hybrid of microcontrollers and digital signal processors (DSPs). DSCs have fast interrupt responses and offer control-oriented peripherals such as PWMs and watchdog timers.
  44. A digital signal processor is a specialized microprocessor (or a SIP block), with its architecture optimized for the operational needs of digital signal processing.
    Digital signal processing (DSP) is the numerical manipulation of signals, usually with the intention to measure, filter, produce or compress continuous analog signals. It is characterized by the use of digital signals to represent these signals as discrete time, discrete frequency, or other discrete domain signals in the form of a sequence of numbers or symbols to permit the digital processing of these signals. The increasing use of computers has resulted in the increased use of, and need for, digital signal processing.
  45. The signature is used to verify that the person who sends a particular transaction is, in fact, the owner of that wallet. It guarantees the contents of a message have not been altered in transit.
    The signature is used to verify that the person who sends a particular transaction is in fact the owner of that wallet. This is done by signing the message included with the transaction so that the recipient can verify it using the sender's public key. This way, the recipient can be sure that only the sender could have sent this message.
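    A minimal sign-and-verify sketch, assuming the third-party Python package cryptography is installed (pip install cryptography) and using Ed25519 keys; it illustrates the general mechanism rather than the signature scheme of any particular wallet or blockchain, and the message is invented.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    private_key = Ed25519PrivateKey.generate()    # kept secret by the sender
    public_key = private_key.public_key()         # shared with anyone who must verify

    message = b"send 0.5 coins to address X"      # invented transaction message
    signature = private_key.sign(message)

    public_key.verify(signature, message)         # passes: contents unaltered
    try:
        public_key.verify(signature, message + b"!")
    except InvalidSignature:
        print("altered message rejected")         # any change breaks the signature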
  46. The digital supply chain merges the major business processes of all parties involved, from the suppliers to the manufacturer and the end customer. The main value lies in the acceleration of the production and logistics processes, the reduction of effort for data acquisition, and the optimization of data security and consistency.
    The potential of a digitized value creation chain lies primarily in the acceleration of the production and logistics processes, the reduction of effort for data acquisition and the optimization of data security and consistency. With integrated networking, the digital value creation chain is able to overcome current media discontinuity. One example from the field of procurement: where a steel-processing company previously had to activate a complicated process via different media for purchasing and replenishment, in the future purchasing will be automated on the basis of predefined parameters. Companies today are already making use of digital value creation chains to optimize individual production islands and processes within their organization.
  47. Digital transformation is the integration of digital technology into all areas of a business, fundamentally changing how you operate and deliver value to customers. It's also a cultural change that requires organizations to continually challenge the status quo, and get comfortable with failure.
  48. Digitization is the process of moving information into a format that can be understood by a computer so that the data can be used in computational calculations.
    Digitization is of crucial importance to data processing, storage and transmission, because it allows information of all kinds in all formats to be carried with the same efficiency and also intermingled. Unlike analog data, which typically suffers some loss of quality each time it is copied or transmitted, digital data can, in theory, be propagated indefinitely with absolutely no degradation. This is why it is a favored way of preserving information for many organisations around the world.
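    A rough sketch of what digitization means technically: a continuous signal is sampled at fixed instants and each sample is quantized to one of a finite set of integer levels; the signal, sample rate and bit depth below are invented for the example.

    import math

    SAMPLE_RATE = 8     # samples per second (invented for the example)
    LEVELS = 16         # 4-bit quantization

    def analog(t):
        """A stand-in for a continuous physical signal in the range [-1, 1]."""
        return math.sin(2 * math.pi * t)

    # Sampling (discrete time) plus quantization (discrete amplitude) yields digital data.
    samples = [analog(n / SAMPLE_RATE) for n in range(SAMPLE_RATE)]
    digital = [round((s + 1) / 2 * (LEVELS - 1)) for s in samples]

    print(digital)      # [8, 13, 15, 13, 8, 2, 0, 2] - these numbers copy without degradation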
  49. Diminished reality is a term used to describe the control over one's reality and the ability to block out real or digital information at will. It diminishes parts of the physical world by removing unwanted objects from view.
    Diminishes parts of the physical world by removing unwanted objects from view. It is the opposite of augmented reality. AR enhances our physical reality with digital assets, while diminished reality digitally removes physical objects from our view.
  50. A directed acyclic graph (DAG) is a directed graph data structure that uses a topological ordering. The sequence can only go from earlier to later. Its main application is an alternative to the Blockchain protocol, provided by IOTA and marketed as the Tangle.
    DAG is often applied to problems related to data processing, scheduling, finding the best route in navigation, and data compression. In DLT, DAG can be used for applications that require scalability in the thousands of transactions per second as it does not require miners, it is feeless, and transactions are confirmed much quicker than Blockchain.
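    A minimal sketch of a DAG and its topological ordering using graphlib from the Python standard library (3.9+); the transaction names are invented, and each entry lists the earlier transactions it references, so the computed order can only run from earlier to later.

    from graphlib import TopologicalSorter   # standard library, Python 3.9+

    # Each key lists the earlier transactions it references (its predecessors).
    dag = {
        "tx_b": {"tx_a"},
        "tx_c": {"tx_a"},
        "tx_d": {"tx_b", "tx_c"},
    }

    order = list(TopologicalSorter(dag).static_order())
    print(order)   # e.g. ['tx_a', 'tx_b', 'tx_c', 'tx_d'] - predecessors always come first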
  51. Distributed cloud means the distribution of public cloud services to locations outside the cloud provider’s physical data centers, but which are still controlled by the provider.
  52. A distributed denial-of-service attack is one in which a multitude of compromised systems attack a single target, thereby causing a denial of service for users of the targeted system.
    DDoS is a type of DoS attack where multiple compromised systems, which are often infected with a Trojan, are used to target a single system causing a Denial of Service (DoS) attack. Victims of a DDoS attack consist of both the end targeted system and all systems maliciously used and controlled by the hacker in the distributed attack. The DoS attack typically uses one computer and one Internet connection to flood a targeted system or resource. The DDoS attack uses multiple computers and Internet connections to flood the targeted resource. DDoS attacks are often global attacks, distributed via botnets.
  53. In computer science, distributed memory refers to a multiprocessor computer system in which each processor has its own private memory. In contrast, a shared memory multiprocessor offers a single memory space used by all processors.
    Computational tasks can only operate on local data, and if remote data is required, the computational task must communicate with one or more remote processors. In contrast, a shared memory multiprocessor offers a single memory space used by all processors. Processors do not have to be aware where data resides, except that there may be performance penalties, and that race conditions are to be avoided.
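    A rough sketch of the distributed-memory idea using two operating-system processes that share no memory and exchange data only by explicit messages over a pipe; in a real distributed-memory machine the same pattern runs across separate nodes (for example with MPI). The data and function names are invented.

    from multiprocessing import Process, Pipe

    def worker(conn):
        local_data = [1, 2, 3, 4]      # private to this process; invisible to the parent
        conn.send(sum(local_data))     # remote data must be communicated explicitly
        conn.close()

    if __name__ == "__main__":
        parent_end, child_end = Pipe()
        p = Process(target=worker, args=(child_end,))
        p.start()
        # The parent cannot read the worker's memory; it can only receive messages.
        print("partial result received:", parent_end.recv())
        p.join()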
  54. In computer programming, distributed revision control, also known as distributed version control or decentralized version control, allows many software developers to work on a given project without requiring them to share a common network.
    Distributed revision control takes a peer-to-peer approach to version control, as opposed to the client-server approach of centralized systems. Rather than a single, central repository on which clients synchronize, each peer's working copy of the codebase is a complete repository.
  55. Docker is an open source software platform that creates, deploys and manages virtualized application containers on a common operating system (OS) through a related tool ecosystem.
  56. A domain model is a system of abstractions that describes selected aspects of a sphere of knowledge, influence, or activity. The model can then be used to solve problems related to that domain.
    A domain model is a system of abstractions that describes selected aspects of a sphere of knowledge, influence, or activity. The model can then be used to solve problems related to that domain. The domain model is a representation of meaningful real-world concepts pertinent to the domain that need to be modeled in software. The concepts include the data involved in the business and rules the business uses in relation to that data. A domain model generally uses the vocabulary of the domain so that a representation of the model can be used to communicate with non-technical stakeholders.
  57. The Domain Name System (DNS) is a hierarchical distributed naming system for computers, services, or any resource connected to the Internet or a private network.
    Most prominently, it translates domain names, which can be easily memorized by humans, to the numerical IP addresses needed for the purpose of computer services and devices worldwide. The Domain Name System is an essential component of the functionality of most Internet services because it is the Internet's primary directory service.
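    A minimal sketch of that name-to-address translation using the Python standard library; the hostname is illustrative and the address returned by the resolver will vary.

    import socket

    hostname = "example.com"                  # human-memorable name (illustrative)
    address = socket.gethostbyname(hostname)  # numerical IPv4 address obtained via DNS
    print(hostname, "->", address)            # the exact address depends on the resolver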
  58. A combination of domestic and robotics - also a composite of the Latin domus and informatics, domotics includes home automation systems, home robots, whole house audio/visual systems, and security systems.
    The field of domotics encompasses all phases of smart home technology, including the highly sophisticated sensors and controls that monitor and automate temperature, lighting, security systems, and many other functions.
  59. A double spend is an attack where the given set of coins is spent in more than one transaction; it is the risk that a digital currency can be spent twice. A 51% attack is one such attack.
  60. Downlink is the process of downloading data onto an end node from a server/target address. In a cellular network, this would be seen as data being sent from a cellular base station to a mobile handset.