Mary Wootters is a computer scientist at Stanford University known for her work in coding theory and its connections to theoretical computer science. Her research focuses on error-correcting codes, including list decoding and locally decodable and repairable codes, as well as randomized algorithms and applications to distributed storage. In addition to her research, Wootters is recognized for her teaching and for mentoring students in algorithms and information theory.
Michael Sipser is a prominent computer scientist and educator, known primarily for his contributions to the fields of theoretical computer science and complexity theory. He is the author of the widely used textbook "Introduction to the Theory of Computation," which covers topics such as formal languages, automata theory, computability, and complexity theory. Sipser has held various academic positions, including serving as a professor at the Massachusetts Institute of Technology (MIT).
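As a concrete illustration of the kind of material such a course covers (a standard definition, paraphrased here rather than quoted from the book), a deterministic finite automaton is specified by five components:

$$
M = (Q, \Sigma, \delta, q_0, F)
$$

where $Q$ is a finite set of states, $\Sigma$ a finite input alphabet, $\delta : Q \times \Sigma \to Q$ the transition function, $q_0 \in Q$ the start state, and $F \subseteq Q$ the set of accepting states.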
Michael W. Shields is not a unique name and could refer to several individuals; in theoretical computer science it is associated with a researcher known for work on non-interleaving models of concurrency, including behavioural presentations and vector languages, and for the book "Semantics of Parallelism".
Nachum Dershowitz is an Israeli computer scientist, a professor at Tel Aviv University, known for his work in theoretical computer science and computational logic. He has made significant contributions to term rewriting systems, in particular orderings for proving termination such as the multiset and recursive path orderings. He is also co-author, with Edward Reingold, of the book "Calendrical Calculations", which gives algorithms for converting among the world's calendar systems.
Nick Pippenger is a notable figure in the field of computer science, particularly known for his contributions to computational complexity and the theory of computation. He is recognized for his work on circuit complexity, switching networks, and superconcentrators. The complexity class NC ("Nick's Class"), consisting of problems that admit efficient parallel algorithms, is named after him.
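For concreteness, the class can be defined as follows (this is the general textbook formulation, not a statement from any particular paper of Pippenger's): $\mathsf{NC}^k$ is the set of decision problems solvable by uniform Boolean circuits of polynomial size and depth $O(\log^k n)$, and

$$
\mathsf{NC} = \bigcup_{k \ge 1} \mathsf{NC}^k .
$$

Informally, these are the problems that can be solved very quickly in parallel using a reasonable amount of hardware.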
Noam Nisan is a prominent computer scientist known for his contributions to various fields within theoretical computer science, economics, and game theory. He is particularly recognized for his work on algorithmic game theory, which combines ideas from computer science and economics to understand strategic interactions in computational settings. Nisan has authored and co-authored numerous influential papers and has been involved in the development of concepts such as mechanism design and auctions within this interdisciplinary framework.
Patricia Bouyer-Decitre is a French computer scientist known for her work in formal verification, in particular the theory of timed automata and the model checking of real-time and probabilistic systems. She is a senior researcher at CNRS and received the EATCS Presburger Award in 2011 for her contributions to the analysis of timed systems.
Ran Libeskind-Hadas is a computer scientist and educator known for his work in computer science education and in algorithms, particularly algorithms for computational biology such as phylogenetic tree reconciliation. A longtime professor at Harvey Mudd College, he has focused on innovative teaching methods and curriculum development, and his work often takes interdisciplinary approaches to problem-solving. His dedication to education has made him a notable figure in discussions on how to teach complex concepts in a clear and engaging manner.
Ray Solomonoff (1926–2009) was an American scientist and a pioneer in the fields of algorithmic information theory and artificial intelligence. He is best known for developing the theory of algorithmic probability and universal inductive inference, a formal approach to prediction and to measuring the information content of data.
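To give a sense of the idea (a standard modern formulation of algorithmic probability, not Solomonoff's original notation), the universal prior of a string $x$ sums the weight of every program that makes a universal prefix machine $U$ output something starting with $x$:

$$
M(x) = \sum_{p \,:\, U(p) = x\ast} 2^{-|p|}
$$

where $|p|$ is the length of program $p$ in bits, so shorter programs, i.e. simpler explanations, contribute more probability, formalizing a version of Occam's razor.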
Richard Lipton is a prominent computer scientist known for his contributions to theoretical computer science, particularly in the areas of algorithms, complexity theory, and cryptography. He is a professor at the Georgia Institute of Technology and has authored numerous research papers and publications in his field. Lipton is also known for his work on the P vs NP problem, a fundamental question in computer science that addresses the relationship between problems that can be solved quickly (in polynomial time) and those for which solutions can be verified quickly.
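Stated formally (the standard formulation of the problem, not specific to Lipton's own work): $\mathsf{P}$ is the class of languages decidable in polynomial time, $\mathsf{NP}$ the class of languages whose yes-instances have certificates verifiable in polynomial time, and the open question is whether the easy inclusion is actually an equality:

$$
\mathsf{P} \subseteq \mathsf{NP}, \qquad \mathsf{P} \stackrel{?}{=} \mathsf{NP} .
$$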
Wilfried Brauer was a German computer scientist, known in particular for his work on automata theory, formal languages, and Petri nets. A professor at the Technical University of Munich, he played an important role in establishing theoretical computer science as a discipline in Germany and held leading roles in German and international computer science societies such as the Gesellschaft für Informatik.
Seinosuke Toda is a Japanese computer scientist known for his work in computational complexity theory. He is best known for Toda's theorem, which shows that every problem in the polynomial hierarchy can be solved in polynomial time given an oracle for counting problems. For this result he received the 1998 Gödel Prize.
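Stated formally (in the standard modern notation rather than that of the original paper), Toda's theorem says

$$
\mathsf{PH} \subseteq \mathsf{P}^{\#\mathsf{P}}
$$

where $\mathsf{PH}$ is the polynomial hierarchy and $\#\mathsf{P}$ is the class of counting problems associated with NP; a single counting oracle thus suffices for the entire hierarchy.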
Solomon Marcus was a prominent Romanian mathematician, known for his significant contributions to various fields of mathematics, including functional analysis, mathematical linguistics, and automata theory. Born on March 1, 1925, he was not only a distinguished researcher but also an influential educator who played a key role in the development of mathematics education in Romania. His work often bridged disciplines, connecting mathematics with computer science and literature.
Relative nonlinearity is a concept that often arises in the context of optics and materials science, particularly when discussing the nonlinear optical properties of materials. It refers to a comparison of the nonlinear response of a medium with its linear response, typically in the context of the refractive index. In nonlinear optics, materials can exhibit a nonlinear response to electromagnetic fields, meaning that properties such as the refractive index change with the intensity of the light rather than remaining constant.
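A concrete example of such a response (the optical Kerr effect, given here as a standard illustration rather than as a definition of relative nonlinearity itself) is an intensity-dependent refractive index,

$$
n(I) = n_0 + n_2 I
$$

where $n_0$ is the linear refractive index, $n_2$ the nonlinear index coefficient, and $I$ the optical intensity; comparing the nonlinear correction $n_2 I$ with the linear term $n_0$ is one way of assessing the nonlinear response relative to the linear one.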
Suresh Venkatasubramanian is a notable figure in the fields of computer science and data science, particularly known for his work on algorithmic fairness, machine learning, and artificial intelligence. He has been involved in research that addresses the intersection of technology and social issues, focusing on how algorithms can impact society and the ethical implications of their use.
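As one example of how such concerns can be formalized (a common criterion from the algorithmic fairness literature, not attributed to Venkatasubramanian specifically), demographic parity asks that a classifier's positive prediction rate not depend on a protected attribute $A$:

$$
\Pr[\hat{Y} = 1 \mid A = a] = \Pr[\hat{Y} = 1 \mid A = a'] \quad \text{for all groups } a, a' .
$$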
Uzi Vishkin is a computer scientist known for his contributions to parallel computing and algorithms. He is a professor at the University of Maryland, College Park, where he has been involved in research and teaching in these areas. Vishkin is particularly noted for his work on the PRAM (parallel random-access machine) model of computation and on PRAM-based parallel algorithms, and he has advocated the explicit multi-threading (XMT) approach for bringing such algorithms to modern multicore architectures. His research often focuses on improving the efficiency and scalability of parallel algorithms in various applications.
Lexicon-grammar is a linguistic framework that combines two core aspects of language: the lexicon (the inventory of words and their meanings) and grammar (the rules and structures that govern how words combine to form sentences). The term is primarily associated with the work of the French linguist Maurice Gross, whose approach is based on the systematic, exhaustive description of the syntactic and semantic properties of individual lexical items.
Symbol theory is a branch of semiotics, which is the study of signs, symbols, and gestures and their meanings within various contexts. Semiotics itself was significantly developed by theorists like Ferdinand de Saussure and Charles Sanders Peirce, and it involves understanding how meaning is constructed and communicated through signs. In the context of symbol theory, the focus is primarily on symbols—entities that represent or stand in for something else.
Dialetheism is the philosophical position that some contradictions can be true. In other words, it holds that there are statements that are both true and false simultaneously. This perspective challenges classical logic, which adheres to the law of non-contradiction, a fundamental principle stating that a proposition cannot be both true and false at the same time.
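In symbols, the classical law of non-contradiction asserts that

$$
\neg (P \land \neg P)
$$

holds for every proposition $P$; a dialetheist maintains that for some particular propositions $P$, both $P$ and $\neg P$ are nonetheless true.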
Epistemic theories of truth are philosophical approaches that relate the concept of truth to knowledge, belief, and justification. In these theories, truth is often understood not as a property of statements or propositions in isolation, but in terms of our knowledge of those statements or propositions. On such views, truth is fundamentally linked to our epistemic conditions: our beliefs, evidence, and justification.
Pinned article: Introduction to the OurBigBook Project
Welcome to the OurBigBook Project! Our goal is to create the perfect publishing platform for STEM subjects, and get university-level students to write the best free STEM tutorials ever.
Everyone is welcome to create an account and play with the site: ourbigbook.com/go/register. We believe that students themselves can write amazing tutorials, but teachers are welcome too. You can write about anything you want, it doesn't have to be STEM or even educational. Silly test content is very welcome and you won't be penalized in any way. Just keep it legal!
Intro to OurBigBook. Source.
We have two killer features:
- topics: topics group articles by different users with the same title, e.g. here is the topic for the "Fundamental Theorem of Calculus": ourbigbook.com/go/topic/fundamental-theorem-of-calculus. Articles by different users are sorted by upvote within each topic page. This feature is a bit like:
- a Wikipedia where each user can have their own version of each article
- a Q&A website like Stack Overflow, where multiple people can give their views on a given topic, and the best ones are sorted by upvote. Except you don't need to wait for someone to ask first, and any topic goes, no matter how narrow or broad
This feature makes it possible for readers to find better explanations of any topic created by other writers. And it allows writers to create an explanation in a place that readers might actually find it.
Figure 1. Screenshot of the "Derivative" topic page. View it live at: ourbigbook.com/go/topic/derivative
Video 2. OurBigBook Web topics demo. Source.
- local editing: you can store all your personal knowledge base content locally in a plaintext markup format that can be edited locally and published either:
- to OurBigBook.com to get awesome multi-user features like topics and likes
- as HTML files to a static website, which you can host yourself for free on many external providers like GitHub Pages, and remain in full control
This way you can be sure that even if OurBigBook.com were to go down one day (which we have no plans to do as it is quite cheap to host!), your content will still be perfectly readable as a static site.
Figure 3. Visual Studio Code extension installation.
Figure 4. Visual Studio Code extension tree navigation.
Figure 5. Web editor. You can also edit articles on the Web editor without installing anything locally.
Video 3. Edit locally and publish demo. Source. This shows editing OurBigBook Markup and publishing it using the Visual Studio Code extension.
Video 4. OurBigBook Visual Studio Code extension editing and navigation demo. Source.
- Infinitely deep tables of contents:
All our software is open source and hosted at: github.com/ourbigbook/ourbigbook
Further documentation can be found at: docs.ourbigbook.com
Feel free to reach out to us for any help or suggestions: docs.ourbigbook.com/#contact