My Ten Year Vision for the World

A Communication Revolution

We're at the beginning of a revolution in communication. Language is one of the hallmarks of human exceptionalism: it serves as a long-term memory for our species and lets us compress complex ideas into a few words. It is extremely powerful. In the future we'll have access to the entirety of human knowledge at our fingertips, instantly, from birth. We'll be able to access this knowledge through a combination of language and technology.

The convergence of language and technology is not just a distant dream; it's rapidly becoming our reality. As we integrate AI into our daily lives, the way we interact with information is fundamentally changing. The AI revolution promises to enhance our cognitive abilities, enabling us to process and understand vast amounts of data with unprecedented speed and clarity.

Imagine a world where language barriers are non-existent, where AI can instantly translate between languages with perfect accuracy. Education will be transformed as personalized AI tutors adapt to each student's learning style, making the acquisition of knowledge faster and more enjoyable. In professional settings, decision-making will be augmented by AI that can provide real-time insights and analyses.

However, this future comes with its own set of challenges and ethical considerations. Privacy concerns, the digital divide, and the potential for misuse of AI are issues that need to be addressed. As developers and users of AI, it is our responsibility to guide this technology towards a future that enhances human capabilities without compromising our values.

The Tools of the Future

Self Knowledge and Awareness

People will be able to view their personal information instantly to discover themselves in the context of humanity. We will use a combination of autobiographical writing and future authoring to compose our personal knowledge base.

Self Relationship

We will relate our personal context to others' contexts to discover symbiotic relationships. We will discover new possible relationships in seconds, and have the ability to exchange a compressed and easily searchable knowledge base about ourselves. This will empower us to work independently, while also benefiting from collaboration.

Access to All Existing Knowledge

General knowledge will become a utility. We will have access to the latest knowledge base, updated frequently, and containing all of human knowledge.

High Dimensional Societal Structure

Human society will be organized ontologically around all human knowledge. Every relationship will be located within the context of the entire relational graph of human knowledge.

Knowledge Gaining Tools

As we progress in our understanding of everything, we will grow the corpus of knowledge. New observations of phenomena will be encoded so that we can place them in the larger context of verified knowledge. These concrete and abstract placements will help us discover transferable approaches for gaining new understanding.

Steps to the Future

Create knowledge buckets

We will need buckets of knowledge for each context:

  • Personal
  • Business
  • Society
  • World

Much of this work is already being done with current Web 2.0 tools:

  • Social media and online document storage
  • Business tools: Confluence, Slack, GitHub
  • Central knowledge hubs: arXiv, Quora, Wikipedia, etc.

The current revolution in AI, especially generative AI, is built on an architecture called the Transformer.

A transformer is a type of neural network architecture used in machine learning and natural language processing. It is popular because it excels at handling sequential data, such as text and speech, and because it is known for its parallel processing capabilities.

At its core, a transformer consists of two main parts:

Encoder: The encoder takes input data, such as a sentence in a natural language, and converts it into a numerical representation. It does this by breaking the input into smaller parts called tokens and then processing them in parallel. Each token is embedded into a high-dimensional vector, which retains information about the token's meaning and position in the sequence. The encoder also uses self-attention mechanisms to capture relationships and dependencies between different tokens in the input sequence.
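The encoder steps above can be sketched in a few lines of NumPy. Everything here is illustrative: the toy vocabulary, the random weight matrices, and the simple sinusoidal position signal stand in for what a trained model would learn. The core operation, scaled dot-product self-attention, is the real mechanism.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy vocabulary and embedding table (values are random, not trained).
rng = np.random.default_rng(0)
vocab = {"the": 0, "future": 1, "is": 2, "now": 3}
d_model = 8
embedding = rng.normal(size=(len(vocab), d_model))

# Tokenize a sentence and embed each token as a vector.
tokens = np.array([vocab[w] for w in ["the", "future", "is", "now"]])
x = embedding[tokens]                      # (seq_len, d_model)

# Add a simple positional signal so word order survives the parallelism.
positions = np.arange(len(tokens))[:, None]
x = x + np.sin(positions / 10.0)

# Scaled dot-product self-attention: every token attends to every token.
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
Q, K, V = x @ Wq, x @ Wk, x @ Wv
scores = Q @ K.T / np.sqrt(d_model)        # (seq_len, seq_len)
weights = softmax(scores)                  # each row sums to 1
out = weights @ V                          # context-mixed token vectors

print(out.shape)  # (4, 8)
```

Each row of `weights` says how much one token "looks at" every other token, which is exactly the relationship-capturing behavior described above.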

Decoder: The decoder takes the numerical representation generated by the encoder and uses it to produce an output sequence, often in the same or another language. Like the encoder, the decoder also uses self-attention mechanisms but with a slight twist. It pays attention to the input sequence (the encoder's output) and its own previously generated output tokens. This helps it generate the output tokens one at a time while considering the context from both the input and the generated parts of the output.
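The "slight twist" in the decoder is a causal mask: when generating token i, the model may only attend to tokens at positions up to i, never to the future. A minimal sketch, with random scores standing in for real attention scores:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

seq_len = 4
rng = np.random.default_rng(1)
scores = rng.normal(size=(seq_len, seq_len))  # stand-in attention scores

# Causal mask: position i may only attend to positions <= i, so the
# decoder generates tokens one at a time without peeking at the future.
mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
scores = np.where(mask, -np.inf, scores)      # masked entries vanish
weights = softmax(scores)

print(np.round(weights, 2))
```

After the softmax, every masked (future) position carries exactly zero weight, while each row still sums to 1 over the visible past.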

One of the key innovations in the transformer architecture is the multi-head self-attention mechanism. It allows the model to focus on different parts of the input sequence simultaneously and learn complex relationships between words or tokens. This parallelism makes transformers highly efficient and powerful for a wide range of tasks, including machine translation, text generation, sentiment analysis, and more.
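Multi-head attention is mostly a reshape: the model dimension is split into several smaller heads, each runs attention independently (and so can focus on a different kind of relationship), and the results are concatenated and mixed back together. A sketch with made-up sizes and random weights:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(2)
seq_len, d_model, n_heads = 4, 8, 2
d_head = d_model // n_heads

x = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv, Wo = (rng.normal(size=(d_model, d_model)) for _ in range(4))

def split_heads(t):
    # (seq_len, d_model) -> (n_heads, seq_len, d_head)
    return t.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)

Q, K, V = (split_heads(x @ W) for W in (Wq, Wk, Wv))
scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)  # one score matrix per head
weights = softmax(scores)
heads = weights @ V                                   # (n_heads, seq_len, d_head)

# Concatenate the heads and mix them with an output projection.
out = heads.transpose(1, 0, 2).reshape(seq_len, d_model) @ Wo
print(out.shape)  # (4, 8)
```

Because every head's attention is a batched matrix multiply, all heads (and all tokens) are computed in parallel, which is the efficiency the text describes.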

Additionally, transformers are known for their ability to handle sequences of variable length, making them versatile for various natural language processing tasks. They have become the backbone of many state-of-the-art models in the field.

In summary, a transformer is a neural network architecture that excels at processing sequential data, thanks to its parallelism and self-attention mechanisms. It's widely used in natural language understanding and generation tasks and has played a significant role in advancing the capabilities of machine learning models.

Context Needed

Transformers need context. They use it to focus attention on relevant knowledge, find relationships between every word, meaning, and concept, and then predict the concepts, meanings, and words that complete the context.

I'm already seeing tools that help with this.

First, we need instantaneous access to this new knowledge service. There are several forms this could take. One is something like Neuralink, an implanted chip that monitors brain activity to gather context.

In conclusion, the integration of AI with language is not just enhancing communication; it's reshaping our very existence. By embracing this change, we can unlock the full potential of human knowledge and creativity, paving the way for an enlightened era of innovation and progress.