Techno-Optimism

Marc Andreessen, co-founder of Netscape and celebrity venture capitalist, recently published The Techno-Optimist Manifesto. The text advocates for techno-optimism, emphasizing the potential of technology and markets to bring about progress and abundance. Andreessen pushes back against pessimism and prevalent criticisms of the tech industry, arguing that technology has been a driving force behind human progress and should be celebrated rather than feared.

This type of binary thinking is common among technologists, and the text is riddled with these false dilemmas — good or bad, growth or death. It reads like a privileged man attempting to coerce people into believing the very things that help increase his vast wealth. Reality is far more nuanced; technology should be celebrated and feared.

"We believe technology is a lever on the world – the way to make more with less."

"Technological innovation in a market system is inherently philanthropic."

While technology can be a great leveler, thus far, the vast majority of technological advancements and economic growth have been disproportionately enjoyed by a small segment of the population, increasing inequality. This can exacerbate social divisions and leave many people behind. Andreessen touts the idea that most of the benefits go to society, ignoring the reality that we’ve witnessed the pugnacious startups of the last decade become some of the most powerful institutions in the world, often wielding more power than governments. While it is true that society as a whole often benefits from technological advances, that increase in power is not apportioned evenly.

Ultimately, the manifesto fails to make the case that the tech industry doesn't deserve the scrutiny it is facing. Andreessen is angry about the tech backlash of the last several years yet completely fails to address the fact that most of the damage is self-inflicted. Andreessen Horowitz was an early investor in Facebook, a company with the idealistic goal of “connect[ing] every person in the world.” Years later, it would be revealed that Facebook allowed the political consulting firm Cambridge Analytica to misuse the data of millions of users. Unfortunately, this was not an isolated incident, and Facebook has been a nearly bottomless pit of scandal over the last decade. This experience underscores the need for a more balanced and nuanced approach, one that takes into account the potential drawbacks and unintended consequences of unchecked technological progress.

Progress is a tightrope act — a deft balance between opposing forces of innovation and accountability that gently bend the future upward for all.

Château Pontet-Canet 2010

It was a year of unusual weather. The 2010 Bordeaux vintage got off to a late start after an especially frigid winter. A wet June gave way to a hot and particularly arid summer, just the right conditions to concentrate the grapes. Cool, damp nights at the start of fall helped bolster acidity, creating a nicely balanced vintage despite lower yields.

The 2010 Pontet-Canet is a classic Pauillac with a captivating aroma of blackcurrant liqueur, rose petals, and violets. It is a brooding crimson in color, thinning slightly toward the edge. The wine is intense and complex while retaining a sense of elegance that prevents it from feeling heavy or tiring. This Bordeaux shows notes of fig, blackberry, cherry, and currant on the palate. The long, satisfying finish brings flavors of leather and tobacco. A tertiary note of earthy mushrooms is faint but unmistakably present.

Offering remarkable depth and complexity, the 2010 vintage of Pontet-Canet might be one of the great wines of this century. I've tasted a good share of Premier Cru Bordeaux, and this wine offers serious competition to the five legendary estates. I look forward to tasting this wine again alongside the 2019 vintage in another decade.

A Lesson of Death and Beauty

I have a love/hate relationship with vintage wine, but the very traits I have come to hate are also the source of my passion. These opposite yet interconnected forces, this frustrating duality, came into focus when a sommelier recently opened one of my bottles of 1981 Château Léoville Las Cases Bordeaux.

This bottle had come to the end of a long journey. Forty-two years ago, the vineyard's grapes were carefully tended over an entire growing season, hand-picked, sorted, and processed. Some of the hands that picked the grapes likely belonged to people who have since passed. These grapes were survivors of the deluge of rain that consumed the first half of October that year. The bottle was cellared for decades in a temperature-controlled environment by multiple owners. There were thousands of opportunities for a mishap, but there it was, forty-two years later, sitting on the bar of my favorite local restaurant. Cutting the foil wrapper revealed a white powdery substance overtaking the liquid-soaked cork, an ominous foreshadowing of what was to come. The wine exhibited an initial hint of mustiness with a short, funk-laced whisper of cassis. It was the taste of oenological expiration. At some unknown point within the last four decades, the wine had died.

My reaction was not disappointment or irritation but a general sense of loss. This wine was painstakingly crafted by a team of passionate people for the purpose of bringing joy, and it never had the chance to realize this goal. Instead, it served as an austere reminder of time's relentless flow, a poignant lesson not to squander our singular opportunity to bring a measure of joy to those around us, and a warning of the precious immediacy of life. Time is slowly consuming us all, and like this bottle of wine, we have but one chance to leave our mark.

Perhaps this is the very source of my passion for wine. Even many of our happiest moments are laced with a sense of melancholy because we know they can't last forever. The emotional power is drawn from this very duality because it's the contrast of one that provides vibrance for the other – light and shadow, life and death.

Château Pontet-Canet 2019

My pilgrimage through the landscape of French viticulture continues with another vintage of Château Pontet-Canet. I previously tasted the 2006 vintage, and though I enjoyed it, the 2019 provides an entirely different tasting experience that very closely matches my conceptual ideal of what a Bordeaux should be.

The 2019 Pontet-Canet bursts with black currant, plum, and dark chocolate. Notes of tobacco and cedar gently penetrate the base of dark fruit in the long finish. The wine possesses an uncanny lightness on the palate despite exhibiting intense, rich flavors. Interestingly, this wine's most distinctive aspect may be how effortlessly it navigates this apparent contradiction.

The tannins are slightly sharp and not yet fully integrated, but that's no surprise given the youth of this vintage. With a decade in the cellar, the 2019 Pontet-Canet may be close to perfect.

Rethinking Biases: Concatenation and String Builder

Everybody knows that string builder classes are more efficient than concatenation, right? Statements like this are passed between generations of developers, quickly becoming common wisdom. But languages evolve rapidly, and that evolution can render dated information irrelevant. So, in the context of Apex development, does this piece of common wisdom hold up? I was recently tasked with a project that required assembling a massive amount of string data into a large JSON payload, and it presented the perfect opportunity to put this claim to the test. The answer? Well, it depends, but it was not what I expected.

To control as many variables as possible, I wrote a short code snippet that builds two identical strings of a precise size, one with each technique. As anticipated, the string builder technique is faster and uses fewer CPU resources with large strings; however, basic concatenation wins in both speed and efficiency for smaller tasks.
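My exact snippet isn't reproduced here, but a minimal sketch of the comparison looks something like the following. Apex has no built-in StringBuilder class, so I'm assuming the usual idiom of buffering parts in a List<String> and joining once at the end; the chunk value and iteration count are placeholders.

```apex
// Minimal benchmark sketch: repeated concatenation vs. the List<String>
// "string builder" idiom. Run as anonymous Apex and vary iterations.
Integer iterations = 5000;   // placeholder; raise to stress both techniques
String chunk = 'abcdefghij'; // placeholder ten-character chunk

// Technique 1: basic concatenation
Integer start = Limits.getCpuTime();
String concatenated = '';
for (Integer i = 0; i < iterations; i++) {
    concatenated += chunk;
}
System.debug('Concatenation: ' + (Limits.getCpuTime() - start) + ' ms');

// Technique 2: buffer the parts, then join once
start = Limits.getCpuTime();
List<String> parts = new List<String>();
for (Integer i = 0; i < iterations; i++) {
    parts.add(chunk);
}
String joined = String.join(parts, '');
System.debug('String builder: ' + (Limits.getCpuTime() - start) + ' ms');
```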

                 5,000 Iterations   50,000 Iterations
Concatenation    241 ms             4,715 ms
String Builder   445 ms             4,614 ms

In fact, concatenation maintains a speed advantage up to a surprisingly high number of iterations; plotting the results shows that concatenation holds the lead until just after 55,000 iterations! So, what's the verdict? Basic concatenation is faster under most circumstances. Only extremely large strings benefit from the string builder technique.

Chicago 2023

I had seen it hundreds of times, but now it was simultaneously familiar and foreign. It was as if I were looking at it for the first time. It shimmered as I walked around the room, the lights catching the deep contours of each brush stroke. His face possessed uncanny depth, and he appeared to come alive. His piercing gaze stared through me as if I didn't exist. The longer I stared into his eyes, the more expression I could glean from his gaunt face. He appeared profoundly sad with a tinge of resignation, hinting that this emotion was not unfamiliar. My experience at the Art Institute of Chicago's Vincent van Gogh exhibit echoed my broader sense of the trip; the city was familiar, but I viewed it as a stranger.

I aimed to look past the trivial details and capture broad shapes given form by the interplay with light—to distill the city's architecture to its essence.

Fratelli Giacosa Basarin Vigna Gianmaté 2015

I have a complicated relationship with the Fratelli Giacosa Basarin Vigna Gianmaté. After my first sip, I contemplated pouring the remainder of the bottle down the drain. But like any good narrative, this bottle contained a plot twist. This Barbaresco is an excellent example of a high-quality wine that doesn't show well straight from the bottle but transforms after decanting. Immediately after opening, the wine is completely out of balance, possessing an impenetrable wall of oak, vanilla, tobacco, and earth that masks any hint of fruit. I consider decanting this wine absolutely essential, so my tasting notes describe my experience after allowing one hour of aeration.

Decanting the wine puts the intense garnet red color on full display. There is more sediment than expected for a wine possessing less than a decade of age. The Basarin Vigna Gianmaté 2015 tastes predominantly of cherry, bolstered with an assertive background of oak and vanilla and an earthy finish. The wine is well structured with ample acidity to balance the tannins.

Château Pontet-Canet 2006

Bordeaux wine is often considered a symbol of elegance, sophistication, and complexity. Nestled in southwest France, Bordeaux is renowned for producing some of the world's most coveted wines. The complexity of Bordeaux lies in the intricate balance of flavors, aromas, and textures unique to each wine. The region's rich history, diverse geography, and meticulous winemaking techniques all contribute to the wine's complex and multifaceted nature. Bottles from this region have a reputation for eye-watering prices; however, many Bordeaux wines offer excellent value, delivering 90% of a coveted first growth at a fraction of the cost. Château Pontet-Canet has long been one of my favorites, offering fantastic wine with quality that remains consistent between vintages. The wines produced by Château Pontet-Canet are renowned for their robust flavors, complexity, and exceptional aging potential.

Château Pontet-Canet is a fifth-growth classified estate with a history that dates back to the early 18th century. Located in the Pauillac appellation in the Bordeaux region of France, the estate has been owned by the Tesseron family since 1975. I've been particularly captivated by the story of Alfred Tesseron, the current owner, who took charge of the estate in 1994. His passion for organic and biodynamic farming makes him a visionary leader in the conservative region of Bordeaux.

The 2006 vintage of Pontet-Canet is somewhat undervalued, given that the year presented a challenging growing season. The wine still tastes young despite having a bit of age. It remains concentrated, with assertive notes of blackberry, plum, and currant. The present but integrated tannins give way to a long, satisfying finish. It doesn't match the 2010 vintage that Robert Parker scored a perfect 100 points, but it isn't too far behind at half the cost.

Archetype El Vergel Estates Gesha 240 Horas - Competition Series

The El Vergel Estates Gesha 240 Horas is a coffee that comes with a story. This year, Archetype Coffee competed in two competitions: Archetype’s owner, Isaiah Sheese, won the United States Barista Championship, and Jesus Iniquez, one of Archetype’s most skilled baristas, placed fourth in the United States Brewers Cup Championship. This is the coffee Jesus selected to compete with. Like nearly all top-quality competition coffee, the El Vergel Gesha was available in limited quantity, with only eighty 227-gram bags offered.

The coffee really shines as a pour over, with notes of black cherry and tropical fruit giving way to floral undertones. Rose hips and moderately dark chocolate dominate the long, evolving finish while the tropical fruit lingers on the palate. This coffee is complex, and it’s obvious why it was selected for competition.

Despite being brewed with a filter in competition, the coffee also shows very well as espresso. A fairly flat seven-bar profile with a short pre-infusion brings out the vibrance and sweetness of the tropical fruit while providing balanced acidity.

Espresso

Bean Weight 18 g
Brew Time 26 sec.
Pressure 7 bar
Water Temperature 91°C
Yield 40 g

Filter (Origami)

Bean Weight 18 g
Brew Time 2:10
Water Temperature 96°C
Yield 280 g

Interpretability in Machine Learning

Since OpenAI released ChatGPT, its large language model (LLM) chatbot, machine learning and artificial intelligence have entered mainstream discourse. The reaction has been a mix of skepticism, trepidation, and panic as the public comes to terms with how this technology will shape our future. Many fail to realize that machine learning already shapes the present, and developers have been grappling with introducing this technology into products and services for years. Machine learning models are used to make increasingly important decisions – from aiding physicians in diagnosing serious health issues to making financial decisions for customers.

How it Works

I strongly dislike the term "artificial intelligence" because what the phrase describes is a mirage. There is no complex thought process at work – the model doesn't even understand the information it is processing. In a nutshell, the OpenAI model powering ChatGPT calculates the statistically most probable next word given the surrounding context, based on the enormous amount of text its developers used to train it.
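As a toy illustration (a sketch of the concept only, not how ChatGPT is actually implemented), imagine the model has already scored a handful of candidate next words for the context "The cat"; generation then favors the most probable candidate:

```apex
// Hypothetical next-word probabilities for the context "The cat".
// The words and numbers are invented for illustration.
Map<String, Double> nextWordProbs = new Map<String, Double>{
    'sat'    => 0.62,
    'ran'    => 0.21,
    'meowed' => 0.09,
    'is'     => 0.08
};

// Pick the statistically most probable next word.
String best;
Double bestProb = -1;
for (String word : nextWordProbs.keySet()) {
    if (nextWordProbs.get(word) > bestProb) {
        bestProb = nextWordProbs.get(word);
        best = word;
    }
}
System.debug('The cat ' + best); // "The cat sat"
```

Repeating this word-by-word process is, at a very high level, how the chatbot assembles its answers.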

A Model?

Let's say we compiled an accurate dataset containing the time it takes for an object to fall from specific heights:

Height Time
100 m 4.51 sec
200 m 6.39 sec
300 m 7.82 sec
400 m 9.03 sec
500 m 10.10 sec

What if we need to determine the time it takes for that object to fall from a distance we don't have data for? We build a model representing our data and either interpolate or extrapolate to find the answer:

t = \sqrt{\frac{2h}{g}}

where h is the height of the fall and g is the acceleration due to gravity (approximately 9.81 m/s² on Earth).
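For example, to estimate the time for a fall from 250 m, a height between two of our measurements, we plug it into the model:

t = \sqrt{\frac{2 \times 250}{9.81}} \approx 7.14 \text{ sec}

which lands, as we'd expect, between the measured times for 200 m (6.39 sec) and 300 m (7.82 sec).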

Models for more complex calculations are often created with neural networks, mathematical systems that learn skills by analyzing vast amounts of data. Each node in the network evaluates a specific function and passes the result to the nodes in the next layer. Simple neural networks can be expressed as mathematical functions, but as the number of variables and nodes increases, the model can become opaque to human comprehension.
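To make that concrete, here is a minimal sketch in Apex (the inputs and weights are invented for illustration) of the work a single node performs: multiply each input by a learned weight, sum the results with a bias, and apply a simple activation before passing the value along.

```apex
// One node of a toy neural network: a weighted sum plus a bias term,
// followed by a ReLU activation. Real networks chain thousands of
// these nodes together, which is where the opacity comes from.
List<Double> inputs  = new List<Double>{ 0.5, -1.2, 3.0 }; // values from the previous layer
List<Double> weights = new List<Double>{ 0.8, -0.3, 0.5 }; // learned coefficients (invented)
Double bias = 0.1;

Double total = bias;
for (Integer i = 0; i < inputs.size(); i++) {
    total += inputs[i] * weights[i];
}

// ReLU activation: suppress negative signals, pass positive ones through.
Double output = total;
if (output < 0) {
    output = 0;
}
System.debug('Node output: ' + output); // 2.36 for these values
```

A network with three inputs and one node is still legible; with millions of weights spread across many layers, no such line-by-line reading is possible.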

The Interpretability Problem

Unfortunately, for many complex models, it is impossible to open the box and provide a precise mathematical explanation for a given decision. In other words, models often lack human interpretability and accountability. We often can't say, mathematically speaking, exactly how the network makes the distinction it does; we only know that its decisions align with those of a human. It doesn't require a keen imagination to see how this presents a problem in regulated, high-stakes decision-making.

Let's say John visits a lender and applies for a $37,000 small business loan. The lender needs to determine the probability that John will default on the loan, so they feed John's information into an algorithm, which computes a low score, causing a denial. By law, the lender must provide John with a statement of the specific reasons for the denial. In this scenario, what do we tell John? Today, we can reverse-engineer the model and provide a detailed answer, but as computing resources become more powerful and less expensive, even the simple models of tomorrow will quickly test the limits of human understanding. So how do we design accountable, transparent systems in the face of exponentially growing complexity?

Solutions?

Proponents of interpretable models suggest limiting the number of variables used in a model. The problem with this approach becomes apparent after considering how neural networks weigh variables. Models multiply results by coefficients that determine the relative importance of each variable or calculation before passing them to the next node. These coefficients, positive and negative, often run 20 to 50 decimal places long. While understanding the data underpinning a decision is essential, it is not enough to produce a clear explanation. We can partially solve this problem by building tooling that abstracts implementation details and provides a more intelligible overview of the model; however, this still only provides an approximation of the decision-making process.
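To see why a small variable count doesn't automatically buy interpretability, consider a sketch of a two-variable credit score of the kind John's lender might use; every number here is invented:

```apex
// Hypothetical two-variable credit score. Even with only two inputs,
// the learned coefficients resist a plain-language explanation.
Double income    = 52000.0; // applicant's annual income (invented)
Double debtRatio = 0.42;    // debt-to-income ratio (invented)

Double score = (income    *  0.0000734918265410038)
             + (debtRatio * -3.2190476523300081)
             + 1.0841267350021094; // bias term
Boolean approved = score > 2.5;    // assumed decision threshold

System.debug('Score: ' + score + ', approved: ' + approved);
```

Telling John that his income was multiplied by 0.0000734918265410038 is technically accurate, but it is not an explanation in any meaningful sense.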

Other thought leaders in machine learning argue that the most viable long-term solutions may not involve futile attempts to explain the model but should instead focus on auditing and regulating performance. Do large volumes of test data reveal statistical trends of bias? Does analyzing the training data show any gaps or irregularities that could result in harm? Unfortunately, this does not solve the issue in my hypothetical scenario above. I can't conclusively prove that my current decision was correct by pointing to past performance.

Technology is simply moving too rapidly to rely on regulations, which are, at best, a lagging remedy. We must pre-emptively work to build explainability into our models, but doing this in an understandable and actionable way will require rethinking our current AI architectures. We need forward-looking solutions that address bias at every stage of the development lifecycle with strong internal governance. Existing systems should undergo regular audits to ensure small changes haven't caused disparate impacts.

I can't help but feel very lucky to live in this transformative sliver of time, from the birth of the personal computer to the beginning of the internet age and the machine learning revolution. Today's developers and system architects have a massive responsibility to consider the impact of the technology they create. The future adoption of AI heavily depends on the trust we build in our systems today.