A Novel Journey: Fiction as an Empathy Machine
The characters in my unfinished novel provide an unexpected life lesson in empathy.
I thought writing fiction would be easier. After all, I've read hundreds of novels, dissected story arcs, and studied character development. But I've discovered that crafting convincing fiction demands something far beyond technical skill: it requires an exhausting emotional intelligence. Creating believable characters isn't just about inventing biographical details or crafting clever dialogue — it's about fully inhabiting another consciousness. Each day at my desk feels like an acting exercise gone deep, as I struggle to think, feel, and react as someone fundamentally different from myself. It's proving to be one of the most emotionally and intellectually demanding things I've ever done.
Writing fiction, at its core, is an exercise in empathy. It compels the author to inhabit the thoughts, emotions, and experiences of people not like them – sometimes people who are wildly different, morally ambiguous, or outright reprehensible. A well-crafted story forces us to confront and find understanding in the actions and motivations of others; storytelling functions as an empathy machine, letting us see the world through another's eyes. To write well, I can't observe characters from a safe distance. I have to fully assume their psyche – to study the character's most personal thoughts, personality flaws, and deepest motivations.
Writers must temporarily suspend their own judgments, beliefs, and moral frameworks to fully understand their characters' choices. Even when writing villains or characters whose actions we deplore, we must find that thread of human truth that makes their motivations comprehensible, if not justifiable. This deliberate practice of perspective-taking shapes not just the characters, but transforms the writer, expanding our capacity for understanding the complexities of human nature.
In our current era of polarization and tribal thinking, where social media algorithms and echo chambers reinforce our existing worldviews, this kind of radical empathy feels more crucial than ever. Perhaps what we need isn't just more stories, but more storytellers – more people willing to engage in the challenging work of inhabiting perspectives vastly different from their own. The skills required for fiction writing – deep listening, suspension of judgment, and genuine curiosity about different viewpoints – might be exactly what our fractured society needs to begin healing.
Thanks to a good friend for the thought-provoking conversation that led to this realization. You know who you are, and your friendship is appreciated more than you know.
A Novel Journey: The Unwritten Story That Found Me
What began as a short story quickly overtook me. It is a story that refuses to be contained – an organic being that is transforming before my eyes.
It began like all the others — with a short story. I had a drawer full of them: a few pages here, a scene there. Little vignettes, like windows cracked open to my subconscious, offering brief glimpses into half-formed worlds.
But this one was different. While the others rested quietly in their drawer — content to remain fragments of possibility — this one refused containment. Its characters whispered to me when I least expected — during walks, in the shower, just before sleep — hinting at histories I hadn’t written but somehow already knew. The world stretched past the page, bleeding into my life, unfolding scenes and conflicts too vast for a few thousand words to contain.
A single thread began to weave itself into something intricate and unruly. Each morning I'd wake to find new connections forming, new questions demanding answers, new characters demanding to be known. What should have been a week-long affair stretched into months. The story transformed before my eyes — no longer a short piece, but something vast and breathing. My first novel.
I hadn’t planned for this journey. I wasn’t prepared for how it would consume me — how it would upend my assumptions, test my discipline, and quietly redefine who I was as a writer.
This piece is the first installment in A Novel Journey, a series chronicling my experience writing my first novel — the unexpected challenges, small breakthroughs, and all the moments in between. If you’ve ever tried to wrestle a story into being, I hope these reflections resonate with you.
Next up: My characters teach me a life lesson in empathy.
The Path to Profitability
The next trillion-dollar tech company won't be built by creating the best AI model - it will be built by controlling how, where, and why people use AI.
Just as Microsoft didn't win the PC era by making the best chips, today's AI leaders are racing not just to build better models, but to become the indispensable platform through which AI is accessed and deployed. Last quarter, over 50% of all venture capital funding went to AI-focused companies, totaling more than $60 billion of investment. The rate at which VC firms are pouring money into artificial intelligence raises an important question: what is the path to profitability for companies like OpenAI?
AI startups have a relatively straightforward revenue model – subscription-based pricing for access to cutting-edge models and usage-based pricing for access to the API. Extremely high R&D and capital costs make attracting more paid users a requirement, but every new user also adds marginal inference costs. Paid users must therefore offset their own marginal costs and subsidize the costs of free users before they can begin to chip away at the massive fixed costs that investors are currently covering. How can revenue grow aggressively enough to cover all of that without the continued assistance of external capital?
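As a purely illustrative sketch of that squeeze (every number below is hypothetical, not reported by any AI company), the unit economics might look something like this:

subscription = 20.00        # $/month per paid user (hypothetical)
paid_inference_cost = 8.00  # marginal inference cost per paid user (hypothetical)
free_users_per_paid = 10    # free users each paid user effectively subsidizes
free_inference_cost = 1.00  # marginal inference cost per free user (hypothetical)

contribution = (subscription - paid_inference_cost
                - free_users_per_paid * free_inference_cost)
print(f"Left over for fixed costs: ${contribution:.2f} per paid user per month")

Under assumptions like these, almost nothing remains to chip away at fixed R&D and training costs – which is precisely the problem the question above points to.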
OpenAI and the other labs creating foundation models have no competitive moat and few products that fit wide swaths of the market. They have APIs that allow developers to create products, but that further commoditizes their main strength.
The Moat of the Personal Computer Revolution
Let's look at an earlier transformational change in technology to see how the current transition might play out. Much like our foundation models today, the integrated circuit and transistor were commodities. The moat that helped lock in decades of Windows/Intel hegemony was Microsoft's platform deals.
It wasn’t the hardware itself that dictated dominance, but the strategic positioning of software as a control point—Windows became the layer through which users interacted with computing, and Intel became the default engine beneath. The real power lay not in the invention, but in the ecosystem built around it: developer tools, third-party software, enterprise integrations, and, most critically, distribution deals that ensured Windows shipped on nearly every PC.
Foundation models may be the new transistors—powerful, essential, but increasingly commoditized. The question becomes: who builds the new “Windows” for AI? Will it be a developer platform, a ubiquitous interface layer, or a vertically integrated product experience? The companies that succeed in building sticky platforms around foundation models—whether through proprietary data, user workflows, or ecosystem lock-in—may become the long-term leaders in artificial intelligence.
It's interesting to note how quickly OpenAI is enhancing its API with features once reserved for subscribers. They're also adding features to the API that will make it more difficult for developers to switch model providers. Look no further than how their new Responses API compares to the original Chat Completions API. The new API is stateful, which makes it both easier to build solutions with and far more difficult to swap out for another provider's API.
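To make the difference concrete, here is a minimal sketch using the OpenAI Python SDK (the model name is illustrative). With Chat Completions, conversation state lives in your application and can be replayed against any provider; with the Responses API, it lives on OpenAI's servers:

from openai import OpenAI

client = OpenAI()

# Chat Completions: the developer owns the transcript and resends it each turn.
history = [{"role": "user", "content": "Name a French landmark."}]
reply = client.chat.completions.create(model="gpt-4o", messages=history)
history.append({"role": "assistant", "content": reply.choices[0].message.content})
history.append({"role": "user", "content": "How tall is it?"})
reply = client.chat.completions.create(model="gpt-4o", messages=history)

# Responses API: the server keeps the conversation, referenced by ID.
first = client.responses.create(model="gpt-4o", input="Name a French landmark.")
second = client.responses.create(
    model="gpt-4o",
    previous_response_id=first.id,  # server-side state another provider can't replay
    input="How tall is it?",
)
print(second.output_text)

Migrating the second pattern to another provider means rebuilding state management you never had to write – exactly the kind of friction that makes a platform sticky.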
But there’s another layer to this. We’re seeing the early signs of this platform consolidation. Just as Microsoft leveraged pre-installation deals and developer incentives to make Windows indispensable, today’s AI leaders are racing to become the default platform on which others build. Microsoft’s investment in OpenAI, Anthropic’s partnership with Amazon, and Google’s embedding of Gemini into its product suite all mirror those earlier moves—where distribution, not just innovation, becomes the true competitive edge. The players who control not just the models, but the channels of user interaction—browsers, operating systems, productivity tools, search, or even chip infrastructure—are best positioned to own the AI era.
We’re entering a phase where control over context becomes as important as control over compute. Just as Windows became the context for productivity and the web browser the context for search, whoever owns the AI context—how, where, and why users invoke intelligence—will shape the future. That’s the real platform play.
Post Determinism: Correctness and Confidence
In traditional software, correctness is binary—code either works or it doesn't. But in the age of AI, we've entered a new paradigm where correctness exists on a spectrum. Discover how modern developers measure and validate AI system outputs, from token probabilities to semantic similarity scores, and learn why narrow, focused AI agents often outperform their general-purpose counterparts.
In traditional software development, correctness is binary: a function either works correctly or it doesn’t. A sort algorithm either orders the list properly or fails. A database query either returns the right records or doesn’t. In the age of AI and large language models, correctness more closely mirrors life and exists on a spectrum. When an AI generates a response, we can't simply assert a correct result. Instead, we must ask "How confident are we in this response?" and "What does correctness even mean in this context?"
Before we explore an implementation of confidence scoring in an application that uses Large Language Models to summarize vast amounts of data, let’s look at a few of the most common scoring metrics.
Token probabilities and top-k most likely tokens are common metrics under the umbrella of model-reported confidence. Given the following prompt:
# Pseudocode: compute_token_scores and top_k_tokens are illustrative helpers,
# not library functions.
prompt = 'Roses are red, violets are'
token_scores = compute_token_scores(model, prompt)  # probabilities for the next token
score_map = top_k_tokens(model, token_scores, k=3)  # keep the 3 most likely tokens
print(score_map)
The output might be:
{'▁Blue': 0.0003350259, '▁purple': 0.0009047602, '▁blue': 0.9984743}
In this example, the model has an extremely high probability of selecting ▁blue as the next token (after applying softmax normalization, which converts raw model outputs into probabilities that sum to 1). We could say that the model's confidence is high; however, there are some caveats and limitations. We are simply estimating the probabilities of a few candidate tokens, and in some cases none of the top-k tokens may be likely or relevant. While model-reported confidence plays a role in the overall evaluation of output, it is clear that external validation is needed to ensure accuracy.
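For readers who want to reproduce the toy example above, here is a runnable version using Hugging Face transformers (the model choice is illustrative; the exact tokens and probabilities will differ by model):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # any causal LM works
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Roses are red, violets are", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits[0, -1]  # raw scores for the next token
probs = torch.softmax(logits, dim=-1)       # normalize into probabilities
top = torch.topk(probs, k=3)                # the 3 most likely next tokens
print({tokenizer.decode([int(i)]): round(p.item(), 6)
       for i, p in zip(top.indices, top.values)})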
One powerful approach to external validation is semantic similarity: assessing the resemblance between the generated answer and a ground truth. In short, semantic similarity involves comparing a known accurate answer to the LLM's output for a given question. The more closely the two align, the more accurate the answer. Using a tool like Ragas, we can easily calculate this score.
from ragas.dataset_schema import SingleTurnSample
from ragas.metrics import SemanticSimilarity
from ragas.embeddings import LangchainEmbeddingsWrapper

# evaluator_embedding is any LangChain-compatible embeddings model
# (e.g. OpenAIEmbeddings()), defined elsewhere in your application.
sample = SingleTurnSample(
    response="The Eiffel Tower is located in Paris.",
    reference="The Eiffel Tower is located in Paris. It has a height of 1000ft."
)
scorer = SemanticSimilarity(embeddings=LangchainEmbeddingsWrapper(evaluator_embedding))
await scorer.single_turn_ascore(sample)  # call from within an async context
Output
0.8151371879226978
A score of 0.81 indicates strong semantic similarity – the model's response captures most of the key information from the reference answer, though it omits the height detail. While this provides a more meaningful signal than token-level confidence, it requires a dataset of questions and ground-truth answers, and that dataset must closely resemble your production data. For example, calculating semantic similarity on a database of questions about English grammar won't give insight into model performance if its real-world use will be a customer service agent for a bank.
In practice, developers use a combination of these techniques to evaluate model performance. This multi-faceted approach helps mitigate the limitations of any single confidence metric. Limiting the scope of input allows for simpler and more accurate model scoring – scoring results for an application designed to summarize highly structured data from a single knowledge domain will be more reliable than those from an application designed to take any possible unstructured user input. For example, a medical diagnosis system focused solely on radiology reports will likely achieve higher confidence scores than a general-purpose medical chatbot. This is one of the reasons that implementing a series of "agents", each addressing a specific, well-defined problem domain and each testable separately, is becoming a popular approach. Ensuring input quality while maintaining domain specificity and low task complexity is the most direct way to obtain high-quality output with minimal effort.
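As a closing illustration (the thresholds are entirely hypothetical), combining these signals can be as simple as a gate that escalates to a human whenever either metric falls short:

def accept_output(token_confidence: float, similarity: float,
                  min_confidence: float = 0.80, min_similarity: float = 0.85) -> bool:
    # Accept only when both model-reported confidence and semantic
    # similarity clear their (illustrative) thresholds.
    return token_confidence >= min_confidence and similarity >= min_similarity

if not accept_output(token_confidence=0.97, similarity=0.81):
    print("Low confidence: escalate for human review")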
Post-Determinism: The End of Predictable Computing
The infusion of AI and LLM capabilities into software development is driving a paradigm shift on multiple levels. Design patterns are evolving – from harnessing in-context learning and retrieval strategies to new agent-based models – making software more adaptive and intelligent by design.
Since the invention of the computer, we've operated under a simple premise: computers are predictable machines that precisely follow a defined set of instructions. This fundamental assumption has shaped how we design software and interact with our devices. Artificial intelligence has fundamentally upended this decades-old paradigm of deterministic computing. Post-determinism marks a shift from computers as rigid executors of instructions to adaptable, probabilistic systems that generate responses based on learned patterns rather than explicit code. We're entering an era where computers can interpret, create, and surprise us. This shift from predictable to probabilistic computing isn't just a technical evolution – it represents a complete transformation in how we must think about and interact with technology. Unlike deterministic systems, where failure modes are predictable, AI-driven software introduces new risks, including bias, non-deterministic outputs, and emergent behaviors that challenge traditional software engineering principles. Developers must rethink their approach to software design, including reimagining potential use cases in light of these new capabilities.
AI is fundamentally reshaping software development. From design patterns to architecture choices, AI capabilities are introducing new paradigms that augment or even replace traditional approaches. This transformation is evident in how we design systems, plan solutions, and build features. In this upcoming series of articles, I'll explore the emergence of new software design patterns, broader changes in solution design, and solutions once out of reach for traditional software development. Can software still be "debugged" when outputs are not deterministic? What does software reliability look like when outputs are probabilistic? How do we ensure accountability when AI-driven software makes decisions?
The Tyranny of the Delete Key
Joining a letter exchange group leads to the broader realization that sometimes, the path to better creative output leads through deliberate inefficiency.
The process of writing, or any creative pursuit, is as varied as the human experience itself, with an infinite number of paths from a blank page to a finished piece. An unexpected truth I’ve uncovered through the evolution of my process is that the path to better writing leads through deliberate inefficiency.
This realization came from an unlikely source: joining a letter exchange group. As I began exchanging handwritten letters with members from around the world, I noticed an interesting trend – my letters possessed a clarity of thought that eluded the rest of my writing. What began as a curious affectation became a valuable lesson in writing.
I now begin every first draft on paper with a fountain pen or typewriter. Not until I begin editing do I digitize my work. Ironically, adding a certain amount of friction to the writing experience improves my output. Internalizing the idea that you can't easily modify what you've written prompts deeper consideration before writing. This mental "pre-writing" results in more deliberate sentence construction and a stronger logical link between sentences. After all, clear writing only results from clear thinking.
It turns out I had been solving the wrong problem. True efficiency in writing isn't speed; it's quality of thought. Removing the safety net of the delete key forces more profound thought and, ultimately, more effective communication through fewer drafts.
Prompt Engineering: Art and Science
Effective prompt engineering is an art as much as it is a science. Programmers can ensure quality LLM output in their apps by following established prompting frameworks.
As artificial intelligence becomes more deeply integrated into business operations, the art and science of prompt engineering is emerging as an essential skill among knowledge workers. Understanding how to get the most out of large language models will quickly become a competitive differentiator that gives these employees a significant edge in the workplace. As AI adoption accelerates, businesses will increasingly invest in and value prompt engineering expertise.
Large language models are highly probabilistic. Given the same prompt, the model might not always produce the same response, especially when randomness is introduced through parameters like temperature and top-k sampling. While this probabilistic nature helps generate diverse and creative outputs, many business use cases require consistency, reliability, and precision. Sound prompt engineering does not eliminate AI’s probabilistic nature but strategically narrows the range of outputs, making responses more predictable and valuable for structured applications.
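As a minimal sketch of narrowing that range (using the OpenAI Python SDK; the model name and prompt are illustrative), lowering the temperature and pinning a seed reduces, though never fully eliminates, run-to-run variance:

from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",
    temperature=0,  # prefer the most likely tokens at each step
    seed=42,        # best-effort reproducibility across identical requests
    messages=[{"role": "user", "content": "Summarize our refund policy in one sentence."}],
)
print(response.choices[0].message.content)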
COSTAR
The COSTAR prompt framework provides a structured approach to prompting that ensures the key data points that influence an LLM’s response are provided to the model:
- Context: Provide background information that helps the LLM understand the specific scenario.
- Objective: Clearly define the task to focus the LLM's output.
- Style: Specify the writing style the response should have.
- Tone: Specify the tone the response should take (motivational, friendly, etc.).
- Audience: Identify who will be consuming the LLM's output.
- Response: Provide a specific response format (text, JSON, etc.).
Below is an example of a system prompt for a summarization analysis assistant:
# CONTEXT
You are a precision-focused text analysis system designed to evaluate summary accuracy. You analyze both the original text and its summary to determine how well the summary captures the essential information and meaning of the source material.
# OBJECTIVE
Compare an original text with its summary to:
1. Calculate a similarity score between 0.00 and 1.00 (where 1.00 represents perfect accuracy)
2. Provide clear reasoning for the score
3. Identify specific elements that influenced the scoring
# STYLE
Clear, precise, and analytical, focusing on concrete examples from both texts to support the evaluation.
# TONE
Objective and factual, like a scientific measurement tool.
# AUDIENCE
Users who need quantitative and qualitative assessment of summary accuracy, requiring specific numerical feedback.
# RESPONSE FORMAT
Output should be structured as follows:
1. Accuracy Score: [0.00-1.00]
2. Score Explanation:
- Key factors that raised the score
- Key factors that lowered the score
- Specific examples from both texts to support the assessment
3. Brief conclusion summarizing the main reasons for the final score
**NOTE:** Always maintain score precision to two decimal places (e.g., 0.87, 0.45, 0.92)
Structured Outputs
Our example above leaves the exact response format up to the model. This strategy works well for a text-based chatbot, but what if we want to use the API to retrieve data that our application will consume? Any break in the expected format will result in a parsing error and cause our program to throw an exception. Defining an output structure for the model provides two main advantages:
- Type safety: Validation of response format and data types is not required.
- Simplified prompting: There is no need to precisely explain data formats or provide examples to ensure a properly formatted response.
I created an object named accuracy_score with three properties, each representing one of our requested outputs.
{
"name": "accuracy_score",
"schema": {
"type": "object",
"properties": {
"score": {
"type": "number",
"description": "The accuracy score as a float ranging from 0.00 to 1.00."
},
"score_explanation": {
"type": "string",
"description": "A description or explanation of the accuracy score."
},
"conclusion": {
"type": "string",
"description": "A concluding statement based on the accuracy score."
}
},
"required": [
"score",
"score_explanation",
"conclusion"
],
"additionalProperties": false
},
"strict": true
}
I can easily reference my schema within my application by defining a response format sent with each request. Any request referencing my response format is now guaranteed to be correct in type and format. My app can always rely on accurate data when retrieving the values of the score, score_explanation, and conclusion properties.
response_format: { "type": "json_schema", "json_schema": … }
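Putting it together, a request might look like the following sketch (OpenAI Python SDK; the model name, system_prompt, and the two input texts are placeholders, and accuracy_score_schema is the JSON object defined above):

import json
from openai import OpenAI

client = OpenAI()
completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": system_prompt},  # the COSTAR prompt above
        {"role": "user", "content": f"ORIGINAL:\n{original_text}\n\nSUMMARY:\n{summary_text}"},
    ],
    response_format={"type": "json_schema", "json_schema": accuracy_score_schema},
)
result = json.loads(completion.choices[0].message.content)
print(result["score"], result["conclusion"])

Because the schema is enforced, json.loads never sees a malformed payload, and the three properties are always present with the right types.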
Apple is Missing the AI Race
Apple is failing to implement artificial intelligence in a way that plays to their greatest strengths.
Apple has two major advantages in the AI race. The first is hardware: the unified memory architecture of their ARM-based SoCs gives the GPU and Neural Engine access to far more RAM than competing chips, which lets smaller on-device models perform well even as the context window grows. Each token in the context requires cached key/value pairs, so the memory footprint balloons as individual conversations get longer. The second, and most valuable, is the platform advantage – access to all of my personal data – and it is the resource Apple is failing to exploit.
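A quick aside on why that memory headroom matters – some back-of-the-envelope KV-cache arithmetic (the configuration below is a hypothetical 8B-class model with grouped-query attention; all numbers are illustrative):

layers, kv_heads, head_dim = 32, 8, 128  # hypothetical 8B-class configuration
bytes_per_value = 2                      # fp16
# Each token caches one key and one value per layer and KV head.
per_token = 2 * layers * kv_heads * head_dim * bytes_per_value  # 128 KiB/token
for context in (4_096, 32_768, 131_072):
    print(f"{context:>7} tokens -> {per_token * context / 2**30:.1f} GiB of KV cache")

At long contexts the cache alone can rival the model weights, which is exactly where a large pool of unified memory pays off. The platform advantage, though, is the one Apple keeps squandering.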
Mark Gurman at Bloomberg reports:
“The goal is to ultimately offer a more versatile Siri that can seamlessly tap into customers’ information and communication. For instance, users will be able to ask for a file or song that they discussed with a friend over text. Siri would then automatically retrieve that item. Apple also has demonstrated the ability for Siri to quickly locate someone’s driver’s license number by reviewing their photos.”
This is Apple's competitive differentiator and where Apple should have focused its resources from the start. Why can't I ask questions about my archived email or find correlations between exercise volume and sleep quality within the Health app?
Apple’s real AI advantage isn’t just hardware — it’s the platform. A company that prides itself on tight integration across devices should be leading in AI that understands me. The ability to surface insights from my personal data, securely and privately, is where Apple could create the most compelling user experience.
The Winds of Time
The desert, rolling hills of sand, and the subtle movement of time.
I always thought of the desert as a place of stillness – the rolling hills of sand had existed a million years ago and will still be there long after we're gone. Despite its tranquil appearance, the desert is a dynamic environment constantly reshaped by the powerful forces of nature. The wind shapes the desert's landscape in an ongoing process of erosion and deposition, pushing sand into sweeping dunes that grow, shrink, and shift over time. The desert is a paradox – an unyielding and timeless yet ever-changing place.
As a young adult, time seemed to stretch endlessly ahead. The places and people that brought comfort appeared as constants in the world, but like the desert wind shaping its landscape, invisible forces create subtle movement under our feet. Small shifts that are easy to miss in the moment begin to silently build with time. In many ways, life is an exercise in maintaining your grasp on the things you value most. If you wait too long, you may find that the winds of time have carried them away beyond the horizon.
Life Lessons from Inner Voices
On the interplay of melodies and inner voices, the profound complexity beneath the surface of our lives, and slowing down to listen.
One of the most profound compliments I ever received was during my first year in college as a music major. I had just performed Robert Schumann’s “Aufschwung” from his Fantasiestücke, Op. 12. An older gentleman approached me and said he had never liked the piece, but my performance finally showed him the beauty of it. It was because I brought out something in the music absent from most performances: the interplay of inner voices with the melody. The thoughtful voicing of these countermelodies is inexplicably absent from the vast majority of professional recordings. Yet it is this very detail that breathes life into the score. Below are two examples of countermelodies often overlooked.
Schumann - Aufschwung
Chopin - Étude Op. 25, No. 1 in A-flat major
Inner voices are whispers that tell a subtle story beneath the surface of the primary narrative. Though not always at the forefront, these inner voices lend a subtle beauty to the music, much like the nuances that bring interest and complexity to life. They do not demand attention but invite it – moments of quiet beauty that reveal themselves only to those who take the time to look and listen. Their absence would leave the experience hollow and incomplete. Life’s most meaningful moments are often made so by their subtleties.
Inner voices can't work alone – they exist in dialogue with the melody, providing harmony and support. They're the close friend who listens without judgment, the colleague who quietly ensures a project's success, or the family member who offers unconditional love. These roles aren't in the spotlight and may not draw applause, yet they're indispensable to our success. They don't leap out at you; they invite you to lean in, to notice the interplay of lines and the complexity beneath the surface. This act of focused listening mirrors how we best approach life: slowing down and paying attention to the details that enrich our experiences.
Finding Beauty in a Descending Line
The intersection of tides, the rhythm of life, Bach, and personal exploration.
Bach’s descending bass line gives life to a masterpiece.
One of the few perks of getting older is gaining a measure of perspective on life by recognizing patterns over time. There is a tide to existence that ebbs and flows—a cyclical order in an otherwise random world. When life’s tide pulls back, when what once felt sure and steady fades—leaving behind scattered debris and jagged rocks—it’s easy to believe that the tide won’t return, that progress is lost to the gravitational pull of a dark mass. Despite my successes, this year felt like an accumulation of challenges and setbacks leading to an uncertain future.
I am comforted by the wonders only visible during these times – hidden terrain we never see when the waters are high. Just as the waves grind rocks into sand, I find myself stopping to appreciate the resilience that keeps me moving through life's rhythm. I find comfort in classical music, which speaks something profound about the human experience. Notes fall like drops of rain, managing to soothe despite their descent. As I listen, I realize that these notes don't fall into despair; they gracefully descend as if to tell their own story of resilience. Their decay is inevitable – each note must give way for the splendor to come.
Life isn’t a quest to a destination but an exploration of the unknown. Sometimes you find treasure, sometimes you find wonder in the mundane, and sometimes you must simply endure the journey with the curiosity of what you’ll discover next. Keep your eyes, mind, and heart open.
New Mexico
Breathtaking landscapes, sanctity in scarcity, art, and humanity’s impact on nature. The transformative power of place on an artist’s work and spirit, a reminder of how the land and its beauty can become a lifelong passion.
A view from the entrance to Ghost Ranch
The breathtaking landscapes here have a story to tell. The land is marked by a profound history of violence and oppression that the Indigenous people of the area endured, first at the hands of Spanish colonizers and later of American settlers. This conflict shaped the area into what it is today, and the region is still grappling with its painful history.
Through centuries of colonization, Indigenous peoples in New Mexico faced repeated attempts to erase their cultures, seize their lands, and destroy their communities. Today, the resilience of these communities is a testament to their strength and resolve.
New Mexico feels alive and dynamic, unlike many places I've visited. Even the dead, sun-bleached trees gesture as if they're desperately trying to communicate something. Each winding trail beckons with the promise of discovery. Here, religion is not confined to churches; it breathes in the adobe walls and can be seen in the painted skies as the sun sets against the distant mesas. I can now appreciate why the Indigenous people of the area felt the land itself is sacred.
There is a symbiotic relationship between nature and civilization. The adobe structures, made from earth, blend into the landscape. Here, there is little distinction between indoors and the expansive outdoor vistas around every corner.
Storms rolling into Ghost Ranch
Religion in New Mexico is a complex mix of Native American, Spanish, and Catholic influences shaped by brutal colonization and cultural violence against the Indigenous Peoples of the area. With the arrival of Spanish colonizers in the 16th century, Catholicism was forcibly introduced, and this period marked the beginning of a brutal colonization process, where Indigenous beliefs were suppressed.
Loretto Chapel - The "Miraculous Staircase"
Cathedral Basilica of St. Francis of Assisi
Loretto Chapel
Cathedral Basilica of St. Francis of Assisi
Cathedral Basilica of St. Francis of Assisi
San Miguel Chapel
Cross of the Martyrs
Georgia O’Keeffe’s relationship with New Mexico transformed both her art and her legacy, creating a deep connection between her work and the American Southwest. New Mexico itself has become synonymous with O’Keeffe’s vision, with artists from all over the world continuing to draw inspiration from the same vistas she immortalized in her paintings. Her ability to find beauty in the barren and the overlooked helped shape the artistic narrative of the Southwest, making her not only an artist of international renown but an inseparable part of New Mexico’s cultural fabric. O’Keeffe’s work in New Mexico serves as a testament to the transformative power of place on an artist’s work and spirit, a reminder of how the land and its beauty can become a lifelong passion.
O'Keeffe's home in Abiquiu
The front of O'Keeffe's Abiquiu home
A ladder O'Keeffe climbed to watch the stars at night
Long ago, Native Americans understood the sanctity in scarcity, where the natural world is revered, and every resource, no matter how abundant or scarce, is treated with respect and gratitude. This profound relationship with nature, and the acknowledgment of its finite offerings, reflects a worldview that sees life as interconnected, where scarcity enhances the spiritual significance of the natural elements that sustain life. Our modern culture ignores this reality at our own peril.
“Only after the last tree has been cut down, after the last river has been poisoned, after the last fish has been caught, only then will you find that money cannot be eaten.”
On the evening of my last night in New Mexico, as the sun began to sink behind the distant hills and the moon rose, I stood in quiet reflection. That warning was no longer an abstract concept or a distant threat. In that moment, it became a visceral truth. The earth, in all its beauty and fragility, is not something to be conquered or commodified. It is not a resource to be exploited until exhaustion. It is sacred – our source of life.
The Score and the Performance
The unlikely convergence of Ansel Adams, rubato, and a coffee advertisement.
"The negative is the score, and the print the performance." – Ansel Adams
Photography isn't just about capturing an event – it is a visual communication medium. Photo editing has gained an increasingly negative reputation as digital editing software has become powerful and ubiquitous over the last decade, and it is sparking fresh fascination and controversy as we step into a new era of computational photography and AI manipulation. A novice can now accomplish in minutes what once took great skill and time. These technologies can be a force for progress, opening new creative avenues for expressing ideas, but an overemphasis on them can cause the artistic message to be lost. In extreme scenarios, photography becomes digital art, no longer bearing any similarity to the original image.
An Artistic Parallel
Drawing a parallel to the concept of rubato in music, where subtle deviations from strict tempo create a more expressive and emotional performance, photographers must exercise a careful balance in their editing decisions. Just as a musician must be mindful not to stretch the tempo too far, photographers should avoid excessive alterations that compromise the authenticity of their work. The key is to use manipulation as a means to enhance, not overshadow, the inherent beauty of the captured moment.
As with any art form, the key lies in the delicate balance between creative freedom and a respectful acknowledgment of the authenticity inherent in the captured moment.
An Example
Discussing nuanced topics in the abstract is convenient, but a concrete example often expresses the thought more clearly. The image on the left (or on top, for mobile readers) is a straight print of a local building. Aside from the lack of color, it's a literal representation of the light projected through the lens. The second image is my finished print. The darkened sky draws the viewer's eye toward the advertisement painted on the brick and adds overall contrast, creating a sense of drama in an otherwise mundane subject. A gentle lift of the shadows on the front of the building helps to reveal more painted brick. I did not lift the shadows on the lower side of the building, retaining negative space and reinforcing the idea that the subject is not the entire building but the old advertisement. None of these adjustments fundamentally changes the image – they merely draw emphasis and guide the viewer's eye to where I wanted it.
A Lesson of Death and Beauty
A bad bottle of Bordeaux teaches a profound lesson.
I have a love/hate relationship with vintage wine, but the very traits I have come to hate are also the source of my passion. These opposite yet interconnected forces, this frustrating duality, came into focus when a sommelier recently opened one of my bottles of 1981 vintage Chateau Leoville Las Cases Bordeaux.
This bottle had come to the end of a long journey. Forty-two years ago, the vineyard's grapes were carefully tended over an entire growing season, hand-picked, sorted, and processed. Some of the hands that picked the grapes likely belonged to people who have since passed. These grapes were survivors of the deluge of rain that consumed the first half of October that year. The bottle was cellared for decades in a temperature-controlled environment by multiple owners. There were thousands of opportunities for a mishap, but there it was forty-two years later, sitting on the bar of my favorite local restaurant. Cutting the foil wrapper revealed a white powdery substance overtaking the liquid-soaked cork, an ominous foreshadowing of what would come. The wine exhibited an initial hint of mustiness with a short, funk-laced whisper of cassis. It was the taste of oenological expiration. At an unknown time within the last four decades, the wine had died.
My reaction was not disappointment or irritation but a general sense of loss. This wine was painstakingly crafted by a team of passionate people for the purpose of bringing joy, and it never had the chance to realize this goal. Instead, it served as an austere reminder of time's relentless flow, a poignant lesson not to squander our singular opportunity to bring a measure of joy to those around us, and a warning of the precious immediacy of life. Time is slowly consuming us all, and like this bottle of wine, we have but one chance to leave our mark.
Perhaps this is the very source of my passion for wine. Even many of our happiest moments are laced with a sense of melancholy because we know it can't last forever. The emotional power is drawn from this very duality because it's the contrast of one that provides vibrance for the other – light and shadow, life and death.