An externality is an economic concept that refers to a situation where the actions of individuals or businesses have an impact on third parties who are not directly involved in the transaction. Externalities can be either positive or negative.
1. **Negative Externality**: This occurs when the actions of an individual or company result in harmful effects on others. For example, pollution from a factory can adversely affect the health of people living nearby or the quality of natural resources.
2. **Positive Externality**: This occurs when the actions of an individual or company benefit third parties who are not part of the transaction. For example, vaccination protects not only the person vaccinated but also those around them.
The Price of Anarchy (PoA) is a concept in game theory that measures the efficiency of equilibria in non-cooperative games, particularly in the context of congestion games. It is defined as the ratio between the social cost of the worst Nash equilibrium and the cost of the socially optimal outcome. In congestion games, players compete for limited resources, and their self-interested choices can lead to outcomes that are suboptimal for the group as a whole.
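As a concrete illustration, consider Pigou's classic two-link routing example, a minimal congestion game in which the Price of Anarchy works out to 4/3. The sketch below (plain Python, with function and variable names chosen for illustration) compares the Nash equilibrium cost with the socially optimal cost:

```python
# Pigou's example: one unit of traffic chooses between two parallel links,
# link A with congestion-dependent cost c(x) = x and link B with constant cost 1.

def total_cost(x_a: float) -> float:
    """Total travel cost when a fraction x_a of traffic uses link A
    (per-user cost x_a) and the rest uses link B (per-user cost 1)."""
    x_b = 1.0 - x_a
    return x_a * x_a + x_b * 1.0

# Nash equilibrium: link A's cost never exceeds 1, so every self-interested
# driver prefers it and all traffic piles onto link A.
nash_cost = total_cost(1.0)      # 1.0

# Social optimum: minimizing total cost gives an even split, x_a = 1/2.
optimal_cost = total_cost(0.5)   # 0.25 + 0.5 = 0.75

price_of_anarchy = nash_cost / optimal_cost
print(price_of_anarchy)          # 4/3 ≈ 1.333...
```

The equilibrium here is strictly worse than the optimum even though no single driver can improve their own travel time by switching, which is exactly the inefficiency the PoA quantifies.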
The Price of Stability (PoS) is a concept in game theory and algorithmic social choice that measures the efficiency of equilibria in games, particularly in the context of strategic interactions among multiple agents or players. Specifically, it quantifies how much the performance of the best Nash equilibrium (a stable state where no player has anything to gain by changing only their own strategy) deviates from the optimal outcome that could be achieved with cooperation.
Implicature is a concept from pragmatics, a subfield of linguistics that studies how context influences the interpretation of meaning in communication. Specifically, implicature refers to information that is suggested or implied by a speaker but not explicitly stated in their utterance. This involves understanding what is meant beyond the literal meaning of words.
Viète's formulas provide relationships between the coefficients of a polynomial and sums and products of its roots.
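For instance, for a monic cubic x^3 + b x^2 + c x + d with roots r1, r2, r3, Viète's formulas state that r1 + r2 + r3 = -b, r1 r2 + r1 r3 + r2 r3 = c, and r1 r2 r3 = -d. A quick Python check on the polynomial with roots 1, 2, 3 (which expands to x^3 - 6x^2 + 11x - 6):

```python
# Verify Viète's formulas for the monic cubic with roots 1, 2, 3:
#   x^3 - 6x^2 + 11x - 6  =>  b = -6, c = 11, d = -6

roots = [1, 2, 3]
b, c, d = -6, 11, -6

sum_roots = sum(roots)                                     # r1 + r2 + r3
pairwise = (roots[0] * roots[1] + roots[0] * roots[2]
            + roots[1] * roots[2])                         # sum of pairwise products
product = roots[0] * roots[1] * roots[2]                   # r1 * r2 * r3

print(sum_roots, -b)   # 6 6
print(pairwise, c)     # 11 11
print(product, -d)     # 6 6
```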
In cosmology, the "Axis of Evil" refers to an observed alignment of large-scale structures in the cosmic microwave background (CMB) radiation and the distribution of galaxies. The term was popularized after the analysis of the CMB data from the Wilkinson Microwave Anisotropy Probe (WMAP), which suggested that there might be unusual patterns indicating that certain directions in the universe appear to be statistically significant and aligned with the solar system.
The term "inflaton" refers to a hypothetical field or particle that is proposed to be responsible for the rapid expansion of the universe during a phase of cosmic inflation. This inflationary period is thought to have occurred in the very early universe, approximately 10^-36 to 10^-32 seconds after the Big Bang. According to inflationary theory, the universe underwent an exponential expansion that smoothed out any initial irregularities and set the stage for the large-scale structure we observe today.
Bond order is a concept in chemistry that refers to the number of chemical bonds between a pair of atoms. It is an indicator of the stability and strength of a bond: the higher the bond order, the stronger and shorter the bond.
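In molecular orbital theory, bond order is computed as half the difference between the number of bonding and antibonding electrons. A minimal Python sketch using valence-electron counts for a few diatomic molecules:

```python
# Molecular-orbital bond order: (bonding electrons - antibonding electrons) / 2.

def bond_order(bonding: int, antibonding: int) -> float:
    return (bonding - antibonding) / 2

# Valence-electron MO occupations (bonding, antibonding):
print(bond_order(2, 0))   # H2  -> 1.0 (single bond)
print(bond_order(2, 2))   # He2 -> 0.0 (no stable bond forms)
print(bond_order(8, 2))   # N2  -> 3.0 (triple bond)
print(bond_order(8, 4))   # O2  -> 2.0 (double bond)
```

The trend matches the statement above: N2, with bond order 3, has a stronger and shorter bond than O2 with bond order 2, while He2, with bond order 0, does not form a stable bond at all.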
Channel capacity is a fundamental concept in information theory that represents the maximum rate at which information can be reliably transmitted over a communication channel. More specifically, it refers to the highest data rate (measured in bits per second, bps) that can be achieved without significant errors as the length of transmission approaches infinity. The concept was introduced by Claude Shannon in his seminal 1948 paper "A Mathematical Theory of Communication."
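As a concrete instance, the binary symmetric channel with crossover probability p (each bit is flipped with probability p) has capacity C = 1 - H(p) bits per channel use, where H is the binary entropy function. A short Python sketch:

```python
from math import log2

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity (bits per channel use) of a binary symmetric channel
    that flips each transmitted bit with probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))    # 1.0 (noiseless channel: one full bit per use)
print(bsc_capacity(0.5))    # 0.0 (pure noise: nothing gets through)
print(bsc_capacity(0.11))   # ≈ 0.5 (about half a bit per use survives)
```

Shannon's theorem says these rates are achievable with arbitrarily small error probability using suitable coding, and that no reliable scheme can exceed them.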
The Common Data Model (CDM) is a standardized data framework that provides a common definition and structure for data across various applications and systems. It is primarily used to enable data interoperability, enhance data sharing, and simplify the process of integrating disparate data sources. CDM is particularly useful in industries such as healthcare, finance, and education, where managing and analyzing data from multiple sources is crucial.
A communication channel refers to the medium or method used to convey information between individuals or groups. It can encompass a wide range of formats and tools, including:
1. **Verbal Communication**: This includes face-to-face conversations, phone calls, video conferences, and speeches.
2. **Written Communication**: This includes emails, text messages, letters, reports, and social media posts.
Cooperative MIMO (Multiple Input Multiple Output) is a wireless communication technique that enhances the performance of MIMO systems by enabling cooperation among multiple users or nodes in a network. Traditional MIMO relies on multiple antennas at both the transmitter and receiver ends to increase capacity and improve signal quality. Cooperative MIMO extends this concept by allowing different users to jointly transmit and receive signals by leveraging their individual antenna resources.
The term "formation matrix" can refer to different concepts depending on the context in which it is used. Here are a few interpretations:
1. **Mathematics and Linear Algebra**: In a mathematical context, a formation matrix can refer to a matrix that represents various types of transformations or formations in geometric or algebraic problems. For example, a formation matrix could be used to describe the position of points in a geometric figure or the relationship between different vectors.
Fano's inequality is a result in information theory that provides a lower bound on the probability of error in estimating a message based on observed data. It quantifies the relationship between the uncertainty of a random variable and the minimal probability of making an incorrect estimation of that variable when provided with some information. More formally, consider a random variable \( X \) with \( n \) possible outcomes and another random variable \( Y \), which represents the "guess" or estimation of \( X \).
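In a commonly used weakened form, Fano's inequality reads H(X|Y) ≤ 1 + P_e log2(n - 1), which rearranges into a lower bound P_e ≥ (H(X|Y) - 1) / log2(n - 1) on the probability of a wrong guess. A small Python sketch (the function name is chosen for illustration):

```python
from math import log2

def fano_error_lower_bound(h_x_given_y: float, n: int) -> float:
    """Weakened Fano bound: any estimator of X from Y errs with probability
    at least (H(X|Y) - 1) / log2(n - 1), where n > 2 is the number of
    possible values of X and H(X|Y) is the conditional entropy in bits."""
    return max(0.0, (h_x_given_y - 1.0) / log2(n - 1))

# If X is uniform over 8 values and Y is independent of X, then
# H(X|Y) = 3 bits, so no guessing rule can err with probability
# below (3 - 1) / log2(7) ≈ 0.712.
print(fano_error_lower_bound(3.0, 8))
```

Intuitively, the more residual uncertainty H(X|Y) the observation leaves about X, the larger the unavoidable error probability.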
An index of information theory articles typically refers to a curated list or database of academic and research articles that focus on information theory, a branch of applied mathematics and electrical engineering that deals with the quantification, storage, and communication of information. Such indexes can help researchers, students, and practitioners find relevant literature on various topics within information theory, including but not limited to:
1. **Fundamental Principles**: Articles discussing the foundational concepts, like entropy, mutual information, and channel capacity.
Human Information Interaction (HII) is a multidisciplinary field that explores how people interact with information, technology, and each other. It encompasses various aspects of human behavior, cognition, and design principles related to the retrieval, processing, and usage of information. The goal of HII is to enhance the effectiveness and efficiency of information interactions, ensuring that users can access, comprehend, and apply information in meaningful ways.
Hyper-encryption is not a widely recognized standard term in the field of cryptography or computer security. However, the term could be interpreted in several ways based on its components, "hyper" and "encryption":
1. **Advanced Encryption Techniques**: It might refer to highly sophisticated encryption methods that go beyond traditional encryption standards, perhaps incorporating multiple layers of encryption or utilizing advanced algorithms that enhance security.
Information behavior refers to the ways in which individuals seek, receive, organize, store, and use information. It encompasses a wide range of activities and processes that people engage in to find and utilize information in their daily lives, whether for personal, professional, academic, or social purposes. Key aspects of information behavior include:
1. **Information Seeking**: The processes and strategies individuals use to locate information.
Information content refers to the amount of meaningful data or knowledge that is contained within a message, signal, or system. In various fields, it can have slightly different interpretations:
1. **Information Theory**: In information theory, established by Claude Shannon, information content is often quantified in terms of entropy. Entropy measures the average amount of information produced by a stochastic source of data. It represents the uncertainty or unpredictability of a system and is typically expressed in bits.
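As a small worked example, the Shannon entropy H = -Σ p log2(p) of a few simple distributions can be computed directly (a minimal Python sketch):

```python
from math import log2

def shannon_entropy(probs: list[float]) -> float:
    """Entropy H = -sum(p * log2(p)) in bits of a discrete distribution;
    zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.0 bit  (fair coin: maximal uncertainty)
print(shannon_entropy([1.0]))         # 0.0 bits (certain outcome: no information)
print(shannon_entropy([0.25] * 4))    # 2.0 bits (fair 4-sided die)
print(shannon_entropy([0.9, 0.1]))    # ≈ 0.469 bits (biased coin: less uncertain)
```

The examples show the two extremes described above: a perfectly predictable source carries no information, while a uniform source maximizes entropy.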
"Quantities of information" often refers to the measurement of information, which can be quantified in several ways depending on the context. Here are some key concepts and methodologies associated with this term:
1. **Bit**: The basic unit of information in computing and information theory. A bit represents a binary choice, like 0 or 1.
2. **Byte**: A group of eight bits; a common unit used to quantify digital information, typically used to represent a character in text.

Pinned article: Introduction to the OurBigBook Project

Welcome to the OurBigBook Project! Our goal is to create the perfect publishing platform for STEM subjects, and get university-level students to write the best free STEM tutorials ever.
Everyone is welcome to create an account and play with the site: ourbigbook.com/go/register. We believe that students themselves can write amazing tutorials, but teachers are welcome too. You can write about anything you want, it doesn't have to be STEM or even educational. Silly test content is very welcome and you won't be penalized in any way. Just keep it legal!
We have two killer features:
  1. topics: topics group articles by different users with the same title, e.g. here is the topic for the "Fundamental Theorem of Calculus" ourbigbook.com/go/topic/fundamental-theorem-of-calculus
    Articles of different users are sorted by upvote within each article page. This feature is a bit like:
    • a Wikipedia where each user can have their own version of each article
    • a Q&A website like Stack Overflow, where multiple people can give their views on a given topic, and the best ones are sorted by upvote. Except you don't need to wait for someone to ask first, and any topic goes, no matter how narrow or broad
    This feature makes it possible for readers to find better explanations of any topic created by other writers. And it allows writers to create an explanation in a place that readers might actually find it.
    Figure 1.
    Screenshot of the "Derivative" topic page
    . View it live at: ourbigbook.com/go/topic/derivative
  2. local editing: you can store all your personal knowledge base content locally in a plaintext markup format that can be edited locally and published either to OurBigBook.com or as a static website.
    This way you can be sure that even if OurBigBook.com were to go down one day (which we have no plans to do as it is quite cheap to host!), your content will still be perfectly readable as a static site.
    Figure 2.
    You can publish local OurBigBook lightweight markup files to either https://OurBigBook.com or as a static website
    .
    Figure 3.
    Visual Studio Code extension installation
    .
    Figure 4.
    Visual Studio Code extension tree navigation
    .
    Figure 5.
    Web editor
    . You can also edit articles on the Web editor without installing anything locally.
    Video 3.
    Edit locally and publish demo
    . Source. This shows editing OurBigBook Markup and publishing it using the Visual Studio Code extension.
    Video 4.
    OurBigBook Visual Studio Code extension editing and navigation demo
    . Source.
  3. Mathematics support: https://raw.githubusercontent.com/ourbigbook/ourbigbook-media/master/feature/x/hilbert-space-arrow.png
  4. Infinitely deep tables of contents:
    Figure 6.
    Dynamic article tree with infinitely deep table of contents
    .
    Descendant pages can also show up as toplevel e.g.: ourbigbook.com/cirosantilli/chordate-subclade
All our software is open source and hosted at: github.com/ourbigbook/ourbigbook
Further documentation can be found at: docs.ourbigbook.com
Feel free to reach out to us for any help or suggestions: docs.ourbigbook.com/#contact