Amelie Carver

3 years ago

Web3 Needs More Writers to Educate Us About It

More on Web3 & Crypto

CyberPunkMetalHead

2 years ago

It's all about the ego with Terra 2.0.

UST depegs and LUNA crashes 99.999% in a fraction of the time it takes the Moon to orbit the Earth.

Fat Man, a Terra whistle-blower, promises to expose Do Kwon's dirty secrets and shady deals.

The Terra community has voted to relaunch Terra LUNA on a new blockchain. The Terra 2.0 Phoenix-1 blockchain went live on May 28, 2022, and holders were airdropped the new token, which keeps the LUNA name, while the old token became LUNA Classic (LUNC).

Does LUNA deserve another chance? To answer this, or at least start a conversation about the Terra 2.0 chain's advantages and limitations, we must assess its fundamentals, ideology, and long-term vision.

Whatever the result, our analysis must be thorough and ruthless. A failure of this magnitude cannot happen again, so we must magnify every potential breaking point by 10.

Will UST and LUNA holders be compensated in full?

The obvious. First, and arguably most important, is to restore previous UST and LUNA holders' bags.

Terra 2.0 has 1,000,000,000 (one billion) tokens to distribute:

  • Community pool: 25%

  • Pre-attack LUNA holders: 35%

  • Pre-attack aUST holders: 10%

  • Post-attack LUNA holders: 10%

  • Post-attack UST holders: 20%

Every LUNA and UST holder has been compensated according to the above proposal.

According to self-reported data, the new chain has 210,000,000 tokens in circulation and a $1.3bn market cap. LUNC and UST alone lost $40bn. The new token must fill this gap. Since launch:

Airdrop recipients collectively hold about $1bn worth of LUNA if we subtract the 25% community pool from the current market cap and assume none of the airdropped LUNA has been sold.

To fill that $40bn gap, the chain must grow roughly 40 times; at the current supply, that puts LUNA at around $240.
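A quick back-of-the-envelope check of that claim, as a JavaScript sketch (it uses only the self-reported figures quoted above and the same assumption that no airdropped LUNA has been sold):

// Back-of-the-envelope check using the self-reported figures above.
const circulatingSupply = 210_000_000;  // LUNA tokens (self-reported)
const marketCap = 1.3e9;                // ~$1.3bn
const communityPoolShare = 0.25;        // 25% community pool airdrop
const lostValue = 40e9;                 // ~$40bn lost by LUNC and UST holders

const holderValue = marketCap * (1 - communityPoolShare);               // roughly $1bn held by airdrop recipients
const requiredGrowth = lostValue / holderValue;                         // roughly 40x
const requiredPrice = (marketCap / circulatingSupply) * requiredGrowth; // roughly $240-250 per LUNA

console.log({ holderValue, requiredGrowth, requiredPrice });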

LUNA needs a full-on Bull Market to make LUNC and UST holders whole.

Who knows if you'll be whole? From the time you bought to the amount and price, there are too many variables to determine if Terra can cover individual losses.

The distribution above doesn't account for individual cases. Solving them would have been a huge undertaking, and Terra didn't attempt it.

What does LUNA offer in terms of value?

UST's market cap peaked at $18bn, while LUNC's was $41bn. LUNC and UST drove the Terra chain's value.

After the collapse confirmed (again) that algorithmic stablecoins are a bad idea, Terra 2.0 no longer supports them.

Algorithmic stablecoins contributed greatly to Terra's growth and value proposition. Terra 2.0 has no product without algorithmic stablecoins.

Terra 2.0 has an identity crisis because it has no actual product. It's like Volkswagen faking carbon emission results and then stopping car production.

A project that has already lost the trust of its users and nearly all of its value cannot survive without a clear and in-demand use case.

What about Do Kwon?

The guy who called people poor on Twitter? Who challenged crypto billionaires to break his LUNA chain? Who dissolved Terraform Labs' Korean entity just before the depeg? The arrogant one?

That's not a good image for LUNA, especially when making amends. I think he should step down and let a nicer person be Terra 2.0's frontman.

The verdict

Terra has a terrific community with an arrogant, unlikeable leader. The new LUNA chain must grow 40 times before it can start making up its losses, and even then, not everyone's losses will be covered.

I won't invest in Terra 2.0 or other algorithmic stablecoins in the near future, and I won't go within 100 miles of any Do Kwon-related project. Just my opinion.

Can Terra 2.0 be saved? Comment below.

Protos

3 years ago

StableGains lost $42M in Anchor Protocol.

StableGains lost millions of dollars in customer funds in Anchor Protocol without telling its users. The Anchor Protocol offered depositors 19-20% APY before its parent ecosystem, Terra LUNA, lost tens of billions of dollars in market capitalization as LUNA fell below $0.01 and its stablecoin (UST) collapsed.

A Terra Research Forum member raised the alarm. StableGains changed its homepage and Terms and Conditions to reflect how it mitigates risk, a tacit admission that it should have done so from the start.

StableGains raised $600,000 in YCombinator's W22 batch. Moonfire, Broom Ventures, and Goodwater Capital invested $3 million more.

StableGains' 15% yield product attracted $42 million in deposits. StableGains kept most of its deposits in Anchor's UST pool earning 19-20% APY, kept one-quarter of the interest as a management fee, and then gave customers their promised 15% APY. It lost almost all customer funds when UST melted down. It changed withdrawal times, hurting customers.

  • StableGains said de-pegging was unlikely. According to its website, 1 UST could always be bought and sold for $1 worth of LUNA. Instead, LUNA became worthless and Terra shut down its blockchain.
  • It promised to diversify assets across several stablecoins to reduce the risk of one losing its $1 peg, but instead kept almost all of them in one basket.
  • StableGains promised withdrawals within three business days, even if a stablecoin needed time to regain its peg, and said customers would receive the exact amount of USDC requested via Coinbase, which it uses for deposits and withdrawals.

StableGains scrubs its website squeaky clean

StableGains later edited its website to say it only uses the "most trusted and tested stablecoins" and extended withdrawal times from three days to indefinite time "in extreme cases."

Previously, the site said it used USDC, TerraUSD (UST), and Dai (DAI). StableGains changed UST-related website content after the meltdown and removed most references to DAI.

Customers noticed a new clause in the Terms and Conditions denying StableGains liability for withdrawal losses. This new clause would have required customers to agree not to sue before withdrawing funds, avoiding a class-action lawsuit.


Customers must sign a waiver to receive a refund.

Erickson Kramer & Osborne law firm has asked StableGains to preserve all internal documents on customer accounts, marketing, and TerraUSD communications. The firm has not yet filed a lawsuit.


Thousands of StableGains customers lost an estimated $42 million.

Celsius Network customers also affected

Celsius Network also used Terra's Anchor Protocol. Celsius users lost money in the crypto market crash and the UST meltdown; many held CEL and LUNA as yield-earning deposits.

CEO Alex Mashinsky accused "unknown malefactors" of targeting Celsius Network without evidence. Celsius has not publicly investigated this claim as of this article's publication.

CEL was already falling before UST de-pegged: it reached $8.01 on June 2, 2021, and closed at $0.82 on May 19.

When some Celsius Network users threatened to leave over token losses, Mashinsky replied, "Leave if you don't think I'm sincere and working harder than you, seven days a week."

Celsius Network withdrew $500 million from Anchor Protocol, but smaller holders had trouble.


Modern Eremite

3 years ago

The complete, easy-to-understand guide to bitcoin

Introduction

Markets rely on knowledge.

The internet provided practically endless knowledge and wisdom. Humanity has never seen such leverage. Technology's progress drives us to adapt to a changing world, changing our routines and behaviors.

People raised in an analogue world may struggle to keep up in a digital age. Can those who can't adapt change their lives? I won't answer that. What we can do is teach those who are willing to learn: unravel the modern world's riddles for them and pass on the wisdom.

Adapt or die. Accept the future or be left behind.

This essay will help you comprehend Bitcoin better than most market participants and the general public. Let's dig into Bitcoin.

Join me.

Ascension

Bitcoin.org was registered in August 2008, and the Bitcoin whitepaper was published on 31 October 2008. The document intrigued and motivated people around the world, from technical engineers to sovereignty seekers. Since then, Bitcoin's whitepaper has been read and researched endlessly to understand its essential concept.

I recommend reading the whitepaper yourself. You'll be able to say you read the Bitcoin whitepaper instead of simply Googling "what is Bitcoin" and reading the fundamental definition without knowing the revolution's scope. The article links to Bitcoin's whitepaper. To avoid being overwhelmed by the whitepaper, read the following article first.

Bitcoin isn't the first peer-to-peer digital currency; Hashcash and Bit Gold came before it. These two Bitcoin precursors failed to gain traction and produce the network effect needed for general adoption. After many struggles, Bitcoin emerged as the most successful cryptocurrency, leading the way for others.

Satoshi Nakamoto, an active bitcointalk.org user, created Bitcoin. Satoshi's identity remains unknown; the last bitcointalk.org login was on 12 December 2010, and since then he has officially disappeared. Thus, conspiracies and riddles surround Bitcoin's creator. I've heard many different theories, some insane and others well thought out.

It's not about who created it; it's about knowing its potential. Since its start, Satoshi's legacy has changed the world and will continue to.

Block-by-block blockchain

Bitcoin is a distributed ledger. What does that mean?

Everyone can view all blockchain transactions, but no one can undo or delete them.

Imagine you and your friends routinely eat out, but only one pays. You're careful with money and what others owe you. How can everyone access the info without it being changed?

You keep a notebook of the evening's transactions, and everyone takes a copy of the page home. If one of you changed the data on their copy, the group would notice and reject it. The majority establishes consensus and provides the official record.

Miners add a new Bitcoin block to the main blockchain every 10 minutes. The appended block contains miner-verified transactions. Now that the next block has been added, the network will receive the next set of user transactions.

Bitcoin Proof of Work—prove you earned it

Any firm needs hardworking personnel to expand and serve clients. Bitcoin isn't that different.

Bitcoin's Proof of Work consensus system needs individuals to validate and create new blocks and check for malicious actors. I'll discuss Bitcoin's blockchain consensus method.

Proof of Work helps Bitcoin reach network consensus. The network is checked and safeguarded by CPU, GPU, or ASIC (Application-Specific Integrated Circuit) Bitcoin-mining machines.

Every 10 minutes, miners are rewarded in Bitcoin for securing and verifying the network. A single miner is unlikely to complete a block alone, so miners form pools, combining their processing power to increase their chances of winning the reward.

In the early days of Bitcoin, individual mining rigs were the norm. Over time, chasing larger earnings despite higher maintenance costs, people built bigger and bigger Bitcoin mining facilities that required a lot of space and sophisticated cooling systems to keep the machines from overheating.

Proof of Work is a vital part of the Bitcoin network: its security comes from the processing power of machines purchased with fiat currency. Miners must invest in mining facilities, which has created a new line of business, mining facility ownership. Bitcoin mining is a topic for a future article.
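To make the idea concrete, here is a toy JavaScript sketch of the hash puzzle behind Proof of Work, using Node's built-in crypto module. The difficulty here is a fixed number of leading zeros, which is a simplification of Bitcoin's real target mechanism (Bitcoin also uses double SHA-256):

const { createHash } = require('crypto');

// Toy Proof of Work: find a nonce so the block hash starts with `difficulty` zeros.
function mine(blockData, difficulty) {
  const prefix = '0'.repeat(difficulty);
  let nonce = 0;
  while (true) {
    const hash = createHash('sha256').update(blockData + nonce).digest('hex');
    if (hash.startsWith(prefix)) {
      return { nonce, hash };
    }
    nonce += 1;
  }
}

// Higher difficulty means exponentially more hashing work, which is what miners are paid for.
console.log(mine('block #1: Alice pays Bob 1 BTC', 4));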

More mining, less reward

Bitcoin is scarce by design.

Why is it scarce? It all comes down to 21,000,000 Bitcoins.

Have all the Bitcoins been mined? Nope. Bitcoin's supply grows until it hits 21 million coins. Initially, each block minted 50 BTC, and each block takes about 10 minutes. Around 2140, the last Bitcoin will be mined.

But 50 BTC every 10 minutes doesn't add up to the year 2140. Indeed, careful reader. That's where Bitcoin's halving process comes in.

What is halving?

The block reward is halved every 210,000 blocks, which takes around 4 years. The initial reward was 50 BTC per block, cut to 25 BTC after the first 210,000 blocks. That first halving occurred on November 28, 2012, when 10,500,000 BTC (50% of the supply) had been mined. As of April 2022, the block reward is 6.25 BTC and will drop to 3.125 BTC at the next halving, expected around 19 March 2024.
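A small JavaScript sketch makes the schedule and the 21-million cap concrete (it ignores that real rewards are paid in whole satoshis and get truncated, so the actual total lands just under 21 million):

const BLOCKS_PER_HALVING = 210_000;
const SATOSHI = 1e-8;      // smallest Bitcoin unit

let reward = 50;           // BTC per block in the first era (2009)
let totalSupply = 0;

for (let era = 0; reward >= SATOSHI; era++) {
  totalSupply += reward * BLOCKS_PER_HALVING;
  console.log(`era ${era}: ${reward} BTC per block, cumulative ~${Math.round(totalSupply)} BTC`);
  reward /= 2;             // the halving
}
// Each era lasts roughly four years, so the final era ends around the year 2140,
// with the cumulative supply converging to just under 21,000,000 BTC.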

The halving method is tied to Bitcoin's hashrate. Here's what "hashrate" means.

What if we increased the number of miners and hashrate they provide to produce a block every 10 minutes? Wouldn't we manufacture blocks faster?

Blocks are generated roughly every 10 minutes, with little variance. Thanks to the built-in adaptive difficulty algorithm, the overall hashrate does not change the average block time: as hashrate increases, constructing a block gets harder. Because the 10-minute block time is effectively fixed, we can estimate when the next halving will occur.

Building with nodes and blocks

For someone new to crypto, the unusual terms and words may be overwhelming. You'll also find everyday words that are easy to guess or have a vague idea of what they mean, how they work, and what they do. Consider blockchain technology.

Nodes and blocks: Think about that for a moment. What is your first idea?

The blockchain is a chain of validated blocks added to the main chain. What's a "block"? What's inside?

The block is another page in the blockchain book that has been filled with transaction information and accepted by the majority.

We won't go into detail about what each block includes and how it's built, as long as you understand its purpose.

What about nodes?

Nodes, along with miners, verify the blockchain's state independently. But why?

To create a full blockchain node, you must download the whole Bitcoin blockchain and check every transaction against Bitcoin's consensus criteria.

What's Bitcoin's size? 

In April 2022, the Bitcoin blockchain was 389.72GB.

Bitcoin's blockchain has miners and node runners.

Let's revisit the US gold rush. Miners mine gold with their own power (physical and monetary resources) and are rewarded with gold (Bitcoin). All become richer with more gold, and so does the country.

Nodes are like sheriffs, ensuring everything is done according to consensus rules and that there are no rogue miners or network users.

Lost and held bitcoin

Does the exchange price of Bitcoin reflect the coins actually available? Once all 21,000,000 are mined, how many will truly remain in circulation: 21 million, or less?

Common sense suggests a 21-million-coin supply.

What if I lost 1BTC from a cold wallet?

What if I saved 1000BTC on paper in 2010 and it was damaged?

What if I mined Bitcoin in 2010 and lost the keys?

And Satoshi Nakamoto's coins? They haven't moved since they were mined.

How many BTC are truly in circulation?

Many people are trying to answer this question, and you may discover a variety of studies and individual research on the topic. Be cautious of the findings because they can't be evaluated and the statistics are hazy guesses.

On the other hand, we have long-term investors who won't sell their Bitcoin or will sell little amounts to cover mining or living needs.

The price of Bitcoin is determined by supply and demand on exchanges using liquid BTC. How many BTC are left after subtracting lost and non-custodial BTC? 

There is significantly less Bitcoin in circulation than you might think, so if we knew the exact quantity of coins actually available, the price might look very different relative to demand.

True HODLers and diamond-hand investors won't sell you their coins, no matter the market.

What's UTXO?

Unspent (U) Transaction (TX) Output (O)

Imagine taking a $100 bill to a store. After choosing a drink and munchies, you walk to the checkout to pay. The cashier takes your $100 bill and gives you $25.50 in change. It's in your wallet.

Is it simply the same $100, minus what you spent? No way.

The $25.50 in your wallet is unrelated to the $100 bill you used. Your wallet's $25.50 is just bills and coins. For example, your wallet may contain either of these combinations:

  • 2 × $10, 1 × $5, 1 × $0.50

  • 1 × $10, 3 × $5, 2 × $0.25

Any combination of coins and bills can equal $25.50. You don't care, and I'd wager you've never ever considered it.

That is UTXO. Now, I'll detail the Bitcoin blockchain and how UTXO works, as it's crucial to know what coins you have in your (hopefully) cold wallet.

Say you purchased 1 BTC. Is it a single coin? No: it's a set of UTXOs adding up to 1 BTC. Now send that BTC to a cold wallet, paying a 0.001 BTC fee so that 0.999 BTC arrives. Is it the same 1 BTC you bought? Well, yes and no. The UTXOs are the same or comparable to before, but they now sit at a different blockchain address. It's as if you handed someone your wallet, they removed the coins needed for the network charge, and then returned the rest of the coins and notes to you.
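Here is a minimal JavaScript sketch of that idea: a wallet balance is just a set of UTXOs, spending consumes whole UTXOs, and the leftover comes back as a change output (amounts in satoshis; the naive selection strategy and identifiers are purely illustrative):

// 1 BTC = 100,000,000 satoshis; these three UTXOs together make up "your 1 BTC".
const utxos = [
  { txid: 'a1', vout: 0, amount: 40_000_000 },
  { txid: 'b2', vout: 1, amount: 35_000_000 },
  { txid: 'c3', vout: 0, amount: 25_000_000 },
];

function spend(available, amount, fee) {
  const inputs = [];
  let total = 0;
  for (const utxo of available) {       // naive selection: take UTXOs until the target is covered
    inputs.push(utxo);
    total += utxo.amount;
    if (total >= amount + fee) break;
  }
  if (total < amount + fee) throw new Error('insufficient funds');
  const change = total - amount - fee;  // paid back to a fresh address you control
  return { inputs, outputs: [{ to: 'cold wallet', amount }, { to: 'change address', amount: change }] };
}

// Sending 0.999 BTC with a 0.001 BTC fee consumes all three UTXOs and leaves no change.
console.log(spend(utxos, 99_900_000, 100_000));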

UTXO is a simple concept, but it's crucial to grasp how it works to comprehend dangers like dust attacks and how coins may be tracked.

Lightning Network: fast cash

You've probably heard of "Layer 2 blockchain" projects.

What does it mean?

Layer 2 on a blockchain is an additional layer that increases the speed and quantity of transactions per minute and reduces transaction fees.

Imagine going to an old-fashioned bank branch to transfer money to another account: you pay a fee and you wait. Compare that with transferring funds through your online bank account or a mobile app, where the fee is low or zero and the cash appears almost instantly. That is the difference between Layer 1 and Layer 2 payment systems.

Layer 1 is not obsolete; it simply has more essential things to focus on, such as adding new, validated blocks to the blockchain, whereas Layer 2 solutions hand Layer 1 transactions they have already processed and verified. The primary blockchain, Bitcoin, only receives the wallets' final state; all the channel transactions that happen before the channel is closed and balanced are irrelevant to the main chain.
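A toy JavaScript sketch of that separation: the two parties' balances change with every off-chain payment, but only the opening deposits and the final closing state would ever be settled on the main chain (the class and the numbers are purely illustrative):

// Toy payment channel: balances are updated off-chain; only open and close touch Layer 1.
class Channel {
  constructor(aliceDeposit, bobDeposit) {
    this.balances = { alice: aliceDeposit, bob: bobDeposit }; // funding transaction (on-chain)
    this.updates = 0;
  }
  pay(from, to, amount) {
    if (this.balances[from] < amount) throw new Error('insufficient channel balance');
    this.balances[from] -= amount;   // off-chain: just a new agreed balance sheet
    this.balances[to] += amount;
    this.updates += 1;
  }
  close() {
    // Only this final state is broadcast and settled on Layer 1.
    return { settledOnChain: this.balances, offChainUpdates: this.updates };
  }
}

const channel = new Channel(100_000, 50_000); // satoshis, illustrative
channel.pay('alice', 'bob', 20_000);
channel.pay('bob', 'alice', 5_000);
console.log(channel.close()); // { settledOnChain: { alice: 85000, bob: 65000 }, offChainUpdates: 2 }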

Layer 2's purpose and the Lightning Network's goal should now be clear. On most blockchains, Layer 2 solutions are themselves built as blockchains, but the Lightning Network is not. Remember the following statement, as it best describes Lightning:

Lightning Network connects public and private Bitcoin wallets.

Opening a private channel with another wallet notifies just two parties. The creation and opening of a public channel tells the network that anyone can use it.

Why create a public Lightning Network channel?

Every transaction through your channel generates fees.

Money, if you don't know.

See who benefits when in doubt.

Anonymity, huh?

Bitcoin anonymity? Bitcoin's anonymity was utilized to launder money.

Well… You've heard similar stories. When you ask why or how it permits people to remain anonymous, the conversation ends as if it were just a story someone heard.

Bitcoin isn't private. Pseudonymous.

What if someone tracks your transactions and discovers your wallet address? Where is your anonymity then?

Bitcoin is like bulletproof glass storage; you can't take or change the money. If you dig and analyze the data, you can see what's inside.

Every online action leaves a trace, and traces may be tracked. People often forget this guideline.

On-chain analysis tools can help you observe what the major players, or whales, are doing with their coins when the market is uncertain. Many people spend time analyzing on-chain data. Is it worth it?

Ask yourself a question. What are the big players' options?  Do you think they're letting you see their wallets for a small on-chain data fee?

Instead of short-term behaviors, focus on long-term trends.

Every wallet transaction leaves more traces. Having nothing to conceal isn't a flaw, but could it lead to Bitcoin being regulated so that every transaction is tracked, as in banks today?

But wait. How can criminals pay out Bitcoin? They're doing it, aren't they?

Mixers can anonymize your coins, letting you use them freely. This is not a guide on how to make your coins anonymous; doing it without knowing what you're doing could cause more harm than good.

Remember, being anonymous attracts greater attention.

Bitcoin isn't the only cryptocurrency we can use to buy things. Using cryptocurrency appropriately can provide usability and anonymity. Monero (XMR), Zcash (ZEC), and Litecoin (LTC) following the Mimblewimble upgrade are examples.

Summary

Congratulations! You've reached the conclusion of the article and learned about Bitcoin and cryptocurrency. You've entered the future.

You know what Bitcoin is, how its blockchain works, and why it's not anonymous. I bet you can explain Lightning Network and UTXO to your buddies.

Markets rely on knowledge. Prepare yourself for success before taking the first step. Let your expertise be your edge.


This article is a summary of this one.

You might also like

Chris Newman

3 years ago

Clean Food: Get Over Yourself If You Want to Save the World.

From Salt Bae, via Facebook

I’m a permaculture farmer. I want to create food-producing ecosystems. My hope is a world with easy access to a cuisine that nourishes consumers, supports producers, and leaves the Earth joyously habitable.

Permaculturists, natural farmers, plantsmen, and foodies share this ambition. I believe this group of green thumbs, stock-folk, and food champions is falling to tribalism, forgetting that rescuing the globe requires saving all of its inhabitants, even those who adore cheap burgers and Coke. We're digging foxholes and turning folks who disagree with us or don't understand into monsters.

Take Dr. Daphne Miller's comments at the end of her Slow Money Journal interview:

“Americans are going to fall into two camps when all is said and done: People who buy cheap goods, regardless of quality, versus people who are willing and able to pay for things that are made with integrity. We are seeing the limits of the “buying cheap crap” approach.”

This is one of the most judgmental things I've read outside the Bible. Consequences:

  • People who purchase inexpensive things (food) are ignorant buffoons who prefer to choose fair trade coffee over fuel as long as the price is correct.

  • It all depends on your WILL to buy quality or cheaply. Both those who are WILLING and those who ARE NOT exist. And able, too.

  • People who are unwilling and unable are purchasing garbage. You're giving your kids bad food. Both the Earth and you are being destroyed by your actions. Your camp is the wrong one. You’re garbage! Disgrace to you.

Dr. Miller didn't say all that, but words are worthless until interpreted, and interpretation depends on the interpreter's economic, racial, political, religious, family, and personal history. Language that compliments one group insults another. Imagine how that Brown/Harvard M.D.'s comment sounds to a low-income household with no savings.

This just went from “cheap burger” to “political statement of blue-collar solidarity.” Thanks, Clean Food, for digging your own grave.

Dr. Miller's comment reflects the echo chamber into which nearly all clean food advocates speak. It asks easy questions and accepts non-solutions like raising food prices and eating less meat. People like me have cultivated an insular world unencumbered by challenges beyond the margins. We may disagree about technical details in rotationally-grazing livestock, but we short circuit when asked how our system could supply half the global beef demand. Most people have never seriously considered this question. We're so loved and affirmed that challenging ourselves doesn't seem necessary. We're like generals insisting we don't need to study the terrain because God is on our side.

“Yes, the $8/lb ground beef is produced the way it should be. Yes, it’s good for my body. Yes it’s good for the Earth. But it’s eight freaking dollars, and my kid needs braces and protein. Bye Felicia, we’re going to McDonald’s.”

-Bobby Q. Homemaker

Funny clean foodies. People don't pay enough for food; they should value it more. Turn the concept of buying food with integrity into a wedge and drive it into the heart of America, dividing the willing and unwilling.

We go apeshit if you call our products high-end.

I've heard all sorts of gaslighting to defend a $10/lb pork chop as accessible (things I’ve definitely said in the past):

  • At Whole Foods, it costs more.

  • The steak at the supermarket is overly affordable.

  • Pay me immediately or the doctor gets paid later.

I spoke with Timbercreek Market and Local Food Hub in front of 60 people. We were asked about local food availability.

They came to me last, after my co-panelists gave the same responses I would have given two years before.

When my turn came, I grumbled: is our food accessible? Nope. It's beyond the wallets of nearly everyone, and that's the biggest problem with sustainable food systems. We're criminally unserious about being leaders in sustainability until we propose solutions beyond economic relativism, wishful thinking, and insisting that vulnerable, distracted people do all the heavy lifting of finding a way to afford our food. And until we talk about solutions, all this talk of preserving the world rings false.

The room fell silent as if I'd revealed a terrible secret. Long, thunderous applause followed my other remarks. But I’m probably not getting invited back to any VNRLI events.

I make pricey cuisine. It’s high-end. I have customers who really have to stretch to get it, and they let me know it. They're forgoing other creature comforts to help me make a living and keep the Earth of my grandmothers alive, and they're doing it as an act of love. They believe in us and our work.

I remember it when I'm up to my shoulders in frigid water, when my vehicle stinks of four types of shit, when I come home covered in blood and mud, when I'm hauling water in 100-degree heat, when I'm herding pigs in a rainstorm and dodging lightning bolts to close the chickens. I'm reminded I'm not alone. Their enthusiasm is worth more than money; it helps me make a life and a living. I won't label that gift less than it is to make my meal seem more accessible.

Not everyone can sacrifice.

Let's not pretend we want to go back to peasant fare, despite our nostalgia. Industrial food has leveled what rich and poor eat. How food is cooked will be the largest difference between what you and a billionaire eat. Rich and poor have access to chicken, pork, and beef. You might be shocked how recently that wasn't the case. This abundance, particularly of animal protein, has helped vulnerable individuals.

Especially when the mutton’s nice and lean (image from The Spruce)

Industrial food causes environmental damage, chronic disease, and distribution inequities. Clean food promotes non-industrial, artisan farming. This creates a higher-quality, more expensive product than the competition; we respond with aggressive marketing and the "people need to value food more" shtick geared at consumers who can spend the extra money.

The guy who is NOT able is rendered invisible by clean food's elitist marketing, which is bizarre given a.) clean food insists it's trying to save the world, yet b.) MOST PEOPLE IN THE WORLD ARE THAT GUY. No one can help him except feel-good charities. That's crazy.

Also wrong: a foodie telling a kid he can't eat a 99-cent fast food hamburger because it lacks integrity. Telling him how easy it is to save his ducketts and maybe have a grass-fed house burger at the end of the month as a reward, but in the meantime get your protein from canned beans you can't bake because you don't have a stove and, even if you did, your mom works two jobs and moonlights as an Uber driver so she doesn't have time to heat that shit up anyway.

A wealthy person's attitude toward the poor is indecent. It's 18th-century Versailles.

“Let them eat cake. Oh, it’s not organic? Let them starve!”

Human rights include access to nutritious food without social or environmental costs. As a food-forest-loving permaculture farmer, I no longer balk at the concept of cultured beef and hydroponics. My food is out of reach for many people, but access to decent food shouldn't be. Cultures and hydroponics could scale to meet the clean food affordability gap without externalities. If technology can deliver great, affordable beef without environmental negative effects, I can't reject it because it's new, unusual, or might endanger my business.

Why is your farm needed if cultured beef and hydroponics can feed the world? Permaculture food forests with trees, perennial plants, and animals are crucial to economically successful environmental protection. No matter how advanced technology gets, we still need clean air, water, soil, greenspace, and food.

Clean Food cultivated in/on live soil, minimally processed, and eaten close to harvest is part of the answer, not THE solution. Clean food advocates must recognize the conflicts at the intersection of environmental, social, and economic sustainability, the disproportionate effects of those conflicts on the poor and lower-middle classes, and the immorality and impracticality of insisting vulnerable people address those conflicts on their own and judging them if they don't.

Our clients, relatives, friends, and communities need an honest assessment of our role in a sustainable future. If we're serious about preserving the world, we owe that honesty to non-customers too; we owe it to our goal and our sanity. Leaving the future health and happiness of the world to the average person's pocketbook and long-term moral considerations is a dismal proposition with few parallels.

Let's make soil and grow food. Let the lab folks do their thing. We're all interdependent.

Samer Buna

2 years ago

The Errors I Committed As a Novice Programmer

Learn to identify them, make habits to avoid them

First, a clarification: this article is meant to make new programmers aware of these mistakes, train them to detect them, and remind them to avoid them.

I made all of these blunders myself and learned from them. I'm glad I now have coding habits that help me avoid them. You should build them too.

These mistakes are not ordered.

1) Writing code haphazardly

Writing good content is hard. It takes planning and investigation. Quality programs don't differ.

Think. Research. Plan. Write. Validate. Modify. Unfortunately, no good acronym exists. Create a habit of doing the proper quantity of these activities.

As a newbie programmer, my biggest error was writing code without thinking or researching. This works for small stand-alone apps but hurts larger ones.

Just as you should think before saying anything you might regret, you should think before writing code you might regret. Coding is expressing your thoughts.

When angry, count to 10 before you speak. If very angry, a hundred. — Thomas Jefferson.

My quote:

When reviewing code, count to 10 before you refactor a line. If the code does not have tests, a hundred. — Samer Buna

Programming is primarily about reviewing prior code, investigating what is needed and how it fits into the current system, and developing small, testable features. Only 10% of the process involves writing code.

Programming is not just writing code. Code needs nurturing.

2) Making excessive plans prior to writing code

Yes, planning before writing code is good, but too much of it is bad. Even water becomes poisonous in excess.

Don't look for a perfect plan; that doesn't exist in programming. Look for a good-enough starting plan. Your plan will change, but it will have forced you to structure your code for clarity. Overplanning wastes time.

Plan only the next few small features. Planning every feature up front should be illegal! The Waterfall approach, a linear step-by-step system, demands that kind of extensive planning, and it is not the planning I mean. Most software projects built with waterfall fail, because implementing anything sophisticated requires agile adaptation to reality.

Programming requires responsiveness. You'll add features no waterfall plan could have foreseen and remove features for reasons you never considered in a waterfall plan. You'll fix bugs and adjust. Be agile.

Do plan your upcoming features, though. Do it carefully, because both too little and too much planning can hurt code quality, and code quality is not something you can afford to risk.

3) Underestimating the Value of Good Code

Readability should be one of your code's primary goals. Unintelligible code stinks; nobody can reuse it.

Never undervalue code quality. Coding communicates implementations. Coders must explicitly communicate solution implementations.

Programming quote I like:

Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live. — John Woods

John, great advice!

Small things matter. If your indentation and capitalization are inconsistent, you should lose your coding license.

Long lines are another simple one. Readability drops after about 80 characters. You might be tempted to keep a long condition on one line to make an if-statement block stand out. No. Just never exceed 80 characters.

Linting and formatting tools fix many basic issues like this. ESLint and Prettier work great together in JavaScript. Use them.

Code quality errors:

Too many lines in a function or file. Break long code into manageable pieces. My rule of thumb: any function longer than 10 lines is too long.

Double-negatives. Don't.

Using double negatives is just very not not wrong

Short, generic, or type-based variable names. Name variables clearly.

There are only two hard things in Computer Science: cache invalidation and naming things. — Phil Karlton

Hard-coding primitive strings and numbers without explanation. If your logic relies on a constant primitive string or numeric value, give it a descriptive name (see the sketch after this list).

Dodging simple problems with sloppy shortcuts and workarounds. Don't evade the problem; take stock and deal with it.

Considering lengthier code better. Shorter code is usually preferable. Only write lengthier versions if they improve code readability. For instance, don't utilize clever one-liners and nested ternary statements just to make the code shorter. In any application, removing unneeded code is better.

Measuring programming progress by lines of code is like measuring aircraft building progress by weight. — Bill Gates

Excessive conditional logic. Most tasks need less conditional logic than we give them. Choose based on readability, and measure performance before optimizing. Avoid Yoda conditions and conditional assignments.
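Here's what I mean by naming constants, as a tiny sketch (the status codes are hypothetical):

// Unexplained literals force the reader to guess what 2 and 5 mean:
function shouldRetryCryptic(statusCode, retries) {
  return statusCode === 2 && retries < 5;
}

// Named constants make the same logic self-describing:
const STATUS_FAILED = 2;
const MAX_RETRIES = 5;

function shouldRetry(statusCode, retries) {
  return statusCode === STATUS_FAILED && retries < MAX_RETRIES;
}

console.log(shouldRetry(STATUS_FAILED, 3)); // true
console.log(shouldRetry(STATUS_FAILED, 7)); // false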

4) Selecting the First Approach

When I started programming, I would solve an issue and move on. I would apply my initial solution without considering its intricacies and probable shortcomings.

After questioning all the solutions, the best ones usually emerge. If you can't think of several answers, you don't grasp the problem.

A programmer's job is not just to solve the problem; it's to find the simplest solution. And that solution must work well and be easy to read, comprehend, and maintain.

There are two ways of constructing a software design. One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. — C.A.R. Hoare

5) Not Giving Up

As a beginner, I would stick with my original solution even when it probably wasn't the best one. The not-quitting mentality may explain this. That mindset helps with most things in life, but not with programming. Programmers should fail early and often.

If you start doubting a solution, toss it and rethink the problem, no matter how much effort you've already put into it. Git lets you branch off and try different solutions; use it.

Do not be attached to code because of how much effort you put into it. Bad code needs to be discarded.

6) Avoiding Google

I've wasted time solving problems when I should have researched them first.

Unless you're employing cutting-edge technology, someone else has probably solved your problem. Google It First.

Googling may discover that what you think is an issue isn't and that you should embrace it. Do not presume you know everything needed to choose a solution. Google surprises.

But Google carefully. Newbies often copy code without understanding it. Use only code you understand, even if it solves your problem.

Never assume you know how to code creatively.

The most dangerous thought that you can have as a creative person is to think that you know what you’re doing. — Bret Victor

7) Failing to Use Encapsulation

This is not about the object-oriented paradigm; encapsulation is always useful. Systems without encapsulation are difficult to maintain.

An application should handle each feature in exactly one place, owned by one thing. The application's other objects should only see what's essential for them to use it. This isn't about secrecy; it's about reducing dependencies between different parts of the application. Following these guidelines lets you safely change the internals of classes, objects, and functions without breaking anything that depends on them.

Group related logic and state into conceptual units. Think of a class as a blueprint or template: it could be an actual class, a function object, a module, or a package.

Within a unit of logic, self-contained tasks deserve their own methods. Methods should do one thing well, and similar classes should use similar method names.

As a rookie programmer, I didn't always establish a new class for a conceptual unit or recognize self-contained units. Newbie code has a Util class full of unrelated code. Another symptom of novice code is when a small change cascades and requires numerous other adjustments.

Think before adding a method, or adding new responsibilities to an existing one. This takes time, but don't skip it and rely on refactoring later. Start right.

High Cohesion and Low Coupling involves grouping relevant code in a class and reducing class dependencies.
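As a minimal sketch of what that looks like in JavaScript (the ShoppingCart example is mine, not from any particular codebase): one class owns a concept, hides its internals behind a private field, and exposes only the methods other code needs.

class ShoppingCart {
  #items = [];                           // private field: an implementation detail nobody else can touch

  add(name, price, quantity = 1) {
    this.#items.push({ name, price, quantity });
  }

  get total() {
    return this.#items.reduce((sum, item) => sum + item.price * item.quantity, 0);
  }
}

const cart = new ShoppingCart();
cart.add('coffee', 4.5, 2);
cart.add('bagel', 2.0);
console.log(cart.total); // 11, and callers never reach into #items directly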

8) Planning for the Unknown

It's tempting to think beyond the solution at hand; every line of code will raise what-ifs. That's excellent for edge cases, but bad for imagined future needs.

Make sure your what-ifs are about edge cases, not future features. Write only the code you need today, and avoid planning for hypothetical futures.

Writing a feature for future use is improper. No.

Write only the code you need today for your solution. Handle edge-cases, but don't introduce edge-features.

Growth for the sake of growth is the ideology of the cancer cell. — Edward Abbey

9) Making the incorrect data structure choices

Beginner programmers often overemphasize algorithms when preparing for interviews. Good algorithms should be identified and used when needed, but memorizing them won't make you a programming genius.

However, learning your language's data structures' strengths and shortcomings will make you a better developer.

Using the wrong data structure is a clear sign of newbie code.

Without turning this into a data structures lesson, let me give you a few examples:

Managing records with arrays instead of maps (objects).

The most common data structure mistake is probably using lists instead of maps to manage records. If you have a list of records, organize them in a map.

Each record in such a list has an identifier you use to look it up. Lists are fine, and often better, for scalar values, especially when the main operation is pushing new values onto the list.

Arrays and objects are the most common list and map structures in JavaScript, respectively (modern JavaScript also has a dedicated Map structure).

Using lists instead of maps for record management fails at scale. The performance difference only really matters for large collections, but I recommend following the rule regardless, because maps look records up by identifier far faster than lists do.
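A small sketch of the difference, with hypothetical records and plain JavaScript structures:

const usersList = [
  { id: 'u1', name: 'Ada' },
  { id: 'u2', name: 'Grace' },
];

// With a list, every lookup by identifier is a linear scan:
const fromList = usersList.find((user) => user.id === 'u2');

// With a map keyed by identifier, a lookup is direct:
const usersById = new Map(usersList.map((user) => [user.id, user]));
const fromMap = usersById.get('u2');

console.log(fromList.name, fromMap.name); // Grace Grace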

Stackless

When writing recursive code, a simple recursive function is tempting. But in single-threaded environments, recursive code can be hard to optimize.

How easy it is to optimize depends on the function's returns: a recursive function that returns two or more calls to itself is much harder to optimize than one that returns a single call.

What beginners overlook is the alternative to a recursive function: a stack. Push the work onto a stack, then pop items off to traverse it.
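For example, here is a depth-first traversal of a small tree written with an explicit stack instead of recursion (the tree shape is made up for illustration):

function collectValues(root) {
  const values = [];
  const stack = [root];                 // explicit stack instead of the call stack
  while (stack.length > 0) {
    const node = stack.pop();
    values.push(node.value);
    for (const child of node.children ?? []) {
      stack.push(child);
    }
  }
  return values;
}

const tree = {
  value: 1,
  children: [{ value: 2, children: [{ value: 4 }] }, { value: 3 }],
};
console.log(collectValues(tree)); // [ 1, 3, 2, 4 ] -- the order differs from recursion, but every node is visited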

10) Worsening the current code

Imagine this:

You need to add an item to a messy room. Since it's already a mess, you might be tempted to drop the item anywhere; you'd be done in seconds.

Don't do that with messy code. Don't make it worse! Keep the code cleaner than it was when you started.

In the messy room, you'd clear space to put the new item where it belongs; if the item is clothing, you'd clear a route to the closet. That's proper execution.

The following bad habits frequently make code worse:

  • code duplication You are merely duplicating code and creating more chaos if you copy/paste a code block and then alter just the line after that. This would be equivalent to adding another chair with a lower base rather than purchasing a new chair with a height-adjustable seat in the context of the aforementioned dirty room example. Always keep abstraction in mind, and use it when appropriate.

  • utilizing configuration files not at all. A configuration file should contain the value you need to utilize if it may differ in certain circumstances or at different times. A configuration file should contain a value if you need to use it across numerous lines of code. Every time you add a new value to the code, simply ask yourself: "Does this value belong in a configuration file?" The most likely response is "yes."

  • Using temporary variables and pointless conditional statements. Every if-statement represents a logic branch that should at the very least be tested twice. Avoid conditionals when doing so doesn't compromise readability. The main issue here is using branch logic to extend an existing function rather than creating a new one. Every time you feel you need an if-statement or a new function argument, ask yourself: am I changing the code at the right level, or should I think about the problem at a higher level?

This code illustrates superfluous if-statements:

function isOdd(number) {
  if (number % 2 === 1) {
    return true;
  } else {
    return false;
  }
}

Can you spot the biggest issue with the isOdd function above?

Unnecessary if-statement. Similar code:

function isOdd(number) {
  return (number % 2 === 1);
};

11) Commenting on the obvious

Over the years I've learnt to avoid comments. Most code comments can be replaced with better names.

instead of:

// This function sums only odd numbers in an array
const sum = (val) => {
  return val.reduce((a, b) => {
    if (b % 2 === 1) { // If the current number is odd
      a+=b;            // Add current number to accumulator
    }
    return a;          // The accumulator
  }, 0);
};

Commentless code looks like this:

const sumOddValues = (array) => {
  return array.reduce((accumulator, currentNumber) => {
    if (isOdd(currentNumber)) { 
      return accumulator + currentNumber;
    }
    return accumulator;
  }, 0);
};

Better function and argument names eliminate most comments. Remember that before commenting.

Sometimes you have to use comments to clarify the code. This is when your comments should answer WHY this code rather than WHAT it does.

Do not write a WHAT remark to clarify the code. Here are some unnecessary comments that clutter code:

// create a variable and initialize it to 0
let sum = 0;
// Loop over array
array.forEach(
  // For each number in the array
  (number) => {
    // Add the current number to the sum variable
    sum += number;
  }
);

Don't be that programmer. Reject that code. Remove such comments if necessary. Most importantly, teach programmers how awful these comments are; tell programmers who keep writing comments like this that they may lose their jobs. They're that terrible.

12) Skipping tests

I'll simplify. If you develop code without tests because you think you're an excellent programmer, you're a rookie.

If you're not writing tests in code, you're probably testing manually. If you're building a web application, that means refreshing the page and interacting with it every few lines of code. There's nothing wrong with manually testing your code, but manual testing should teach you how to test automatically: after you test your application by hand, go back to your code editor and write code that performs the same interaction automatically the next time you add more code.

You're human. After each code change, you will forget to re-test all the validations that used to pass. Automate it!

You can guess or design your validations before writing the code that fulfills them. TDD is a real thing, and it improves the way you think about your features' design.

If you can use TDD, even partially, do so.
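Here is the kind of minimal, automated check I mean, using plain Node assertions (the slugify function is a made-up example):

const assert = require('assert');

// Expectations written first (or alongside the code), then the implementation makes them pass.
function slugify(title) {
  return title.trim().toLowerCase().replace(/\s+/g, '-');
}

assert.strictEqual(slugify('Hello World'), 'hello-world');
assert.strictEqual(slugify('  Mixed   Spacing  '), 'mixed-spacing');
assert.strictEqual(slugify('already-a-slug'), 'already-a-slug');
console.log('all slugify tests passed');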

13) Making the assumption that if something is working, it must be right.

See this sumOddValues function. Is it flawed?

const sumOddValues = (array) => {
  return array.reduce((accumulator, currentNumber) => {
    if (currentNumber % 2 === 1) { 
      return accumulator + currentNumber;
    }
    return accumulator;
  });
};
 
 
console.assert(
  sumOddValues([1, 2, 3, 4, 5]) === 9
);

The assertion passes. Life is good. Right?

Code above is incomplete. It handles some scenarios correctly, including the assumption used, but it has many other issues. I'll list some:

#1: No empty input handling. What happens when the function is called without arguments? That results in an error revealing the function's implementation:

TypeError: Cannot read property 'reduce' of undefined.

This is a sign of faulty code, for two main reasons:

  • Your function's users shouldn't come across implementation-related information.

  • The user cannot benefit from the error. Simply said, they were unable to use your function. They would be aware that they misused the function if the error was more obvious about the usage issue. You might decide to make the function throw a custom exception, for instance:

TypeError: Cannot execute function for empty list.

Better still, instead of throwing at all, the function could treat empty input as an empty list and return a sum of 0. Either way, this case requires a decision and action.

Problem #2: No input validation. What happens if the function is invoked with a text, integer, or object instead of an array?

The function now throws:

sumOddValues(42);
TypeError: array.reduce is not a function

Unfortunately, array.reduce IS a function!

Because we named the argument array, the function labels whatever you call it with (42 in the example above) as array. The error is really saying that 42.reduce is not a function.

See how confusing that error is? An error like this would have been far more helpful:

TypeError: 42 is not an array, dude.

Problems #1 and #2 are edge cases. They're typical ones, but you should also consider the less obvious cases. What happens with negative numbers?

sumOddValues([1, 2, 3, 4, 5, -13]) // => still 9

-13 is an odd number, yet it was ignored. Is this the desired behavior? Should the function throw an error? Should it sum negative numbers? Should it keep ignoring them? You may start to see that the function should really have been named sumPositiveOddNumbers.

This decision is simple. The more essential point is that if you don't write a test case to document your decision, future function maintainers won't know if you ignored negative values intentionally or accidentally.

It’s not a bug. It’s a feature. — Someone who forgot a test case

#3: Valid cases are not tested. Forget edge-cases, this function mishandles a straightforward case:

sumOddValues([2, 1, 3, 4, 5]) // => 11

The 2 above was wrongly included in sum.

The solution is simple: reduce accepts a second input to initialize the accumulator. Reduce will use the first value in the collection as the accumulator if that argument is not provided, like in the code above. The sum included the test case's first even value.

This test case should have been included in the tests along with many others, such as all-even numbers, a list with 0 in it, and an empty list.
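Putting the three fixes together, one possible corrected version looks like this (negative numbers are still ignored here, matching the current behavior; that choice should itself be documented by a test):

const sumOddValues = (array) => {
  if (!Array.isArray(array)) {
    throw new TypeError(`Expected an array, got: ${array}`);
  }
  return array.reduce((accumulator, currentNumber) => {
    if (currentNumber % 2 === 1) {
      return accumulator + currentNumber;
    }
    return accumulator;
  }, 0);                                 // the missing second argument: initialize the accumulator
};

console.assert(sumOddValues([2, 1, 3, 4, 5]) === 9);  // the leading even value no longer leaks into the sum
console.assert(sumOddValues([2, 4, 6]) === 0);        // all-even list
console.assert(sumOddValues([0, 1, 3]) === 4);        // list containing 0
console.assert(sumOddValues([]) === 0);               // empty list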

Newbie code also has rudimentary tests that disregard edge-cases.

14) Treating Existing Code as Law

Unless you're a lone supercoder, you'll encounter stupid code. Beginners don't identify it and assume it's decent code because it works and has been in the codebase for a while.

Worse, if that terrible code uses bad practices, the newbie may be enticed to reuse those practices elsewhere in the codebase, having "learnt" them from what they assumed was good code.

A unique condition may have pushed the developer to write faulty code. This is a nice spot for a thorough note that informs newbies about that condition and why the code is written that way.

Beginners should not simply trust undocumented code they don't understand. Ask about it. Question it. Git-blame it!

If the code's author is dead or can't remember it, research and understand it. Only after understanding the code can you judge its quality. Before that, presume nothing.

15) Being fixated on best practices

The term "best practices" does damage. It implies that no further research is needed: this is the best practice ever, no doubts!

There are no best practices; there are good practices, today, for a given programming language.

Some of what were once considered programming best practices are now considered bad practices.

Time will reveal better practices. Focus on what you can do well, not on "best practices".

Do not do anything because you read a quote, saw someone else do it, or heard it is a recommended practice. This contains all my article advice! Ask questions, challenge theories, know your options, and make informed decisions.

16) Being preoccupied with performance

Premature optimization is the root of all evil (or at least most of it) in programming — Donald Knuth (1974)

I think Donald Knuth's advice is still relevant today, even though programming has changed.

Do not optimize code if you cannot measure the suspected performance problem.

Optimizing before the code even runs is almost certainly premature, and you may well be wasting your time.

There are a few obvious optimizations worth keeping in mind while writing new code. In Node.js, for example, you must not flood the event loop or block the call stack; remember that kind of early optimization by asking: will this code block the call stack?

Beyond the obvious, avoid optimizing code without measurements. If you do it anyway, your "performance boost" may introduce new problems.

Stop optimizing unmeasured performance issues.

17) Missing the End-User Experience as a Goal

What's the easiest way to add a feature to an app? Look at it from your own perspective, or at wherever it fits in the existing User Interface. Right? If the feature captures user input, bolt it onto the existing form. If it adds a link to a page, tuck it into your nested menu of links.

Don't be that developer. Be a professional who empathizes with customers: imagine the needs and behavior of the people who will use this feature, and focus on making it easy to find and use, not just on getting it into the software.

18) Choosing the incorrect tool for the task

Every programmer has their preferred tools. Most tools are good for one thing and bad for others.

The worst tool for screwing in a screw is a hammer. Do not use your favorite hammer on a screw. Don't use Amazon's most popular hammer on a screw.

A true beginner relies on tool popularity rather than problem fit.

You may not know the best tool for a project. Sometimes the best tool is one you've never heard of, and it may not rank high in popularity. You must learn your tools and stay open to new ones.

Some coders shun new tools. They like their tools and don't want to learn new ones. I can relate, but it's wrong.

You can build a house slowly with basic tools or rapidly with superior tools. You must learn and use new tools.

19) Failing to recognize that data issues are caused by code issues

Programs commonly manage data. The software will add, delete, and change records.

Even the simplest programming errors can corrupt data in unpredictable ways, especially when the same defective application is the only thing validating that data.

Beginners may not connect code problems with data problems. They may tolerate broken code in production because "feature X is not critical", while that buggy code quietly causes data integrity issues.

Worse, deploying code that fixes the bugs without also repairing the minor data problems those bugs already caused just lets the data problems accumulate, until the situation reaches the unrecoverable category.

How do you avoid these issues? Use multiple layers of data integrity validation: front-end, back-end, over the network, and in the database. At the very least, apply database constraints.

Use all database constraints when adding columns and tables:

  • If a column has a NOT NULL constraint, null values are rejected for that column. If your application expects that field to always have a value, the database should mark the column as NOT NULL.

  • If a column has a UNIQUE constraint, the entire table cannot include duplicate values for that column. This is ideal for a username or email field on a Users table, for instance.

  • For data to be accepted, a CHECK constraint, a custom expression, must evaluate to true. For instance, you can apply a check constraint to ensure that the values of a percentage column fall between 0 and 100.

  • With a PRIMARY KEY constraint, the values of the key columns must be both unique and not null. You're probably already using this one: every table needs a primary key to distinguish its records.

  • A FOREIGN KEY constraint requires that the values in one database column, typically a primary key, match those in another table column.

Indifference to transactions is another newbie data integrity issue. If several operations modify the same data source and depend on each other, they must be wrapped in a transaction that can be rolled back when one of them fails.

20) Reinventing the Wheel

This one is tricky, because some programming wheels are worth reinventing. Programming is not a well-defined domain; new requirements and changes happen faster than any team can handle.

If you need a wheel that spins at different speeds depending on the time of day, maybe you do need to rethink the wheel rather than modify the one we all know and love. But if you don't need a non-standard wheel, don't reinvent it. Use the darn wheel.

Picking between wheel "brands" can be hard, so research and test before you commit. Most software wheels are free, with transparent internals, so you can evaluate the quality of their design. Prefer open-source wheels: you can debug and fix them easily, replace them more easily, and even support them in-house.

On the other hand, if you just need a wheel, don't buy a whole new car and bolt it onto the one you already maintain. Don't pull in an entire library to use a couple of its functions. Lodash in JavaScript is the classic example: if you need to shuffle an array, import the shuffle function, not all of lodash.
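For example (a sketch, assuming lodash is installed as a dependency):

// Heavy: pulls the entire lodash library into your bundle for one function.
// const _ = require('lodash');
// const shuffled = _.shuffle([1, 2, 3, 4]);

// Lighter: require only the module you actually use.
const shuffle = require('lodash/shuffle');
console.log(shuffle([1, 2, 3, 4]));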

21) Adopting the incorrect perspective on code reviews

Beginners often take code reviews as criticism: they dislike them, don't appreciate them, and sometimes even fear them.

That's the wrong attitude; if it's yours, change it right now. Treat every code review as a chance to learn. Welcome them, take notes, and most important of all, thank reviewers who teach you.

You will be learning to code for as long as you code; accept it. Most code reviews will teach you something new, so use them for learning.

Sometimes the reviewer is wrong and you'll need to correct them; but if your code didn't make that evident on its own, maybe it needs to change anyway. And if you do end up teaching your reviewer, remember that teaching is one of the most enjoyable things a programmer can do.

22) Not Using Source Control

Newbies often underestimate Git's capabilities.

Source control is about much more than sharing your changes; it's much bigger than that. Source control is about clear history. The history of the code will help you address complex problems later. Commit messages matter: they are another way to communicate your implementations, and using them with small commits helps future maintainers understand how the code got to where it is.

Commit early and often, writing messages with present-tense verbs. Keep the summary concise but the message detailed; if you need more than a few lines, your commit is probably too big. Rebase!

Avoid noise in commit messages. Summaries should not list new, changed, or deleted files, since Git commands can produce that list from the commit object itself; in the summary it's just noise. A summary that needs a line per altered file is another sign the commit is too big.

Source control involves discoverability. You can discover the commit that introduced a function and see its context if you doubt its need or design. Commits can even pinpoint which code caused a bug. Git has a binary search within commits (bisect) to find the bug-causing commit.

Source control can be used before commits to great effect. Staging changes, patching selectively, resetting, stashing, editing, applying, diffing, reversing, and others enrich your coding flow. Know, use, and enjoy them.

Someone who doesn't know and use these features is, to me, still a Git rookie.

23) Excessive Use of Shared State

Again, this is not about functional programming vs. other paradigms. That's another article.

Shared state is problematic and should be avoided if feasible. If not, use shared state as little as possible.

As a new programmer, I didn't realize that every variable is shared state: any code in the same scope can change its data. The wider the scope, the wider the shared state's reach. Keep new state in the smallest scope possible and make sure it doesn't leak upward.

Things get really bad (in event-loop-based environments) when multiple resources modify shared state within the same event-loop tick. Race conditions happen.

A rookie facing this kind of race condition, especially alongside a data-locking issue, may be tempted to reach for a timer. That's a red flag. No. Never accept it.
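Here is a tiny JavaScript sketch of the kind of race shared state invites (the delay just stands in for any asynchronous work):

let sharedCounter = 0;

async function incrementSlowly() {
  const current = sharedCounter;                            // read shared state
  await new Promise((resolve) => setTimeout(resolve, 10));  // something asynchronous happens
  sharedCounter = current + 1;                              // write back a stale value
}

Promise.all([incrementSlowly(), incrementSlowly()]).then(() => {
  console.log(sharedCounter); // 1, not 2: the second write clobbers the first
});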

24) Adopting the Wrong Mentality Toward Errors

Errors are good. Progress. They indicate a simple way to improve.

Expert programmers enjoy errors. Newbies detest them.

If these lovely red error warnings irritate you, modify your mindset. Consider them helpers. Handle them. Use them to advance.

Some errors should be handled as exceptions, so plan for user-defined exceptions. Other errors should not be handled at all: let the app crash and exit.

25) Ignoring rest periods

Humans need mental breaks, so take them. When you're in the zone, you'll forget to take breaks; that's another symptom of beginners. No compromises: make breaks a mandatory part of your process. Take frequent pauses. Take a little walk to plan your next move, then come back and reread the code.

This has been a long post. You deserve a break.

Jussi Luukkonen, MBA

3 years ago

Is Apple Secretly Building A Disruptive Tsunami?

A TECHNICAL THOUGHT

The IT giant is seeding the digital Great Renaissance.

The Great Wave off Kanagawa by Hokusai— Image by WikiImages from Pixabay

Recently, technology has been dull.

We're still fascinated by processing speeds. Wearables are no longer an engineer's dream.

Apple has been quiet and avoided huge announcements. That slowness says something. Everything in the spaceship HQ seems to be turning slowly and deliberately, while competitors buzz around buzzwords.

Is this a sign of the impending storm?

Meta's stock has fallen while Google milks dumb people. Microsoft steals money from corporations and annexes platforms like LinkedIn.

Just surface bubbles?

Is Apple, one of the technology continents, pushing against all others to create a paradigm shift?

The fundamental human right to privacy

Apple stands out in how consistently it emphasizes privacy, incorporating it into its business model and decisions.

Apple believes privacy is a human right. There are no compromises.

This makes it hard for other players to match the efficiencies of Apple's ecosystem.

Other players without hardware platforms lose.

Apple delivers new kidneys without rejection, unlike other software vendors. Nothing compromises your privacy.

Corporate citizenship will become more popular.

Apple's coffers are full, and it has started using that cash flow to better communities, which is great.

Apple's $2.5bn housing investment is one example; Google and Facebook are also building, or proposing to build, workforce housing.

Apple's funding helps marginalized populations in more than 25 California counties, not just Apple employees.

Is this a trend, and does Apple keep giving back? Hope so.

I'm not cynical enough to suspect these investments have malicious motives.

The last frontier is the environment.

Climate change is a battle-to-win.

Long-term winners will be companies that protect the environment, turning climate change dystopia into sustainable growth.

Apple has been quietly changing its supply chain to be carbon-neutral by 2030.

“Apple is dedicated to protecting the planet we all share with solutions that are supporting the communities where we work.” Lisa Jackson, Apple’s vice president of environment.

Apple's $4.7 billion Green Bond investment will produce 1.2 gigawatts of green energy for the corporation and US communities. Apple invests $2.2 billion in Europe's green energy. In the Philippines, Thailand, Nigeria, Vietnam, Colombia, Israel, and South Africa, solar installations are helping communities obtain sustainable energy.

Apple is already carbon neutral today for its global corporate operations, and this new commitment means that by 2030, every Apple device sold will have net zero climate impact. -Apple.

Apple invests in green energy and forests to reduce its paper footprint in China and the US. Apple and the Conservation Fund are safeguarding 36,000 acres of US working forest, according to GreenBiz.

Apple's packaging paper is recycled or from sustainably managed forests.

What matters is the scale.

$1 billion is a rounding error for Apple.

These small investments originate from a tree with deep, spreading roots.

Apple's genes are anchored in building the finest products possible to improve consumers' lives.

I felt it when I had to pack up my MacBook while waiting for a train and switched to my iPhone. iOS 16's dictation makes writing more enjoyable, and that small change boosts productivity: the transition from laptop to small screen and dictation is smooth.

Apple's tiny, well-planned steps carry great growth potential for all consumers, in everything the company does.

There is clearly disruption, but it doesn't have to be violent

Digital channels, methods, and technologies have globalized human consciousness. One person's responsibility affects many.

Apple gives us tools to be privately connected. These technologies foster creativity, innovation, fulfillment, and safety.

Apple has invented a mountain of technologies, services, and channels to help us adapt to a better future and resist those who cynically aim to control us and ruin the environment and our communities. Apple has quietly disrupted sectors for decades.

Google, Microsoft, and Meta, among others, should ride this wave. It's a tsunami, but it doesn't have to be devastating if we care, share, and cooperate with political decision-makers and community leaders worldwide.

A fresh Renaissance

Michelangelo and Da Vinci were Renaissance geniuses: different men, but each saw something no one else could yet see. Both were talented in many areas and could find art in science and science in art.

These geniuses exemplified a period that changed humanity for the better. They created, used, and applied new, valuable things. It lives on.

Apple is a digital genius orchard. Wozniak and Jobs offered us fertile ground for the digital renaissance. We'll build on their legacy.

We may put our seeds there and see them bloom despite corporate greed and political ignorance.

I think the coming tsunami will illuminate our planet like the Renaissance.