
More on Web3 & Crypto

Protos

3 years ago

StableGains lost $42M in Anchor Protocol.

StableGains lost millions of dollars in customer funds in Anchor Protocol without telling its users. The Anchor Protocol offered depositors 19-20% APY before its parent ecosystem, Terra LUNA, lost tens of billions of dollars in market capitalization as LUNA fell below $0.01 and its stablecoin (UST) collapsed.

A Terra Research Forum member raised the alarm. StableGains changed its homepage and Terms and Conditions to reflect how it mitigates risk, a tacit admission that it should have done so from the start.

StableGains raised $600,000 in YCombinator's W22 batch. Moonfire, Broom Ventures, and Goodwater Capital invested $3 million more.

StableGains' 15% yield product attracted $42 million in deposits. StableGains kept most of its deposits in Anchor's UST pool earning 19-20% APY, kept one-quarter of the interest as a management fee, and then gave customers their promised 15% APY. It lost almost all customer funds when UST melted down. It changed withdrawal times, hurting customers.

  • StableGains said de-pegging was unlikely. According to its website, 1 UST could always be bought and sold for $1 worth of LUNA. LUNA became worthless, and Terra shut down its blockchain.
  • It promised to diversify assets across several stablecoins to reduce the risk of any one losing its $1 peg, but instead kept almost all of them in one basket.
  • StableGains promised withdrawals within three business days, even if a stablecoin needed time to regain its peg, with deposits and withdrawals handled through Coinbase and customers receiving the exact amount of USDC requested.

StableGains scrubs its website squeaky clean

StableGains later edited its website to say it only uses the "most trusted and tested stablecoins" and extended withdrawal times from three days to an indefinite period "in extreme cases."

Previously, it used USDC, TerraUSD (UST), and Dai (DAI). StableGains changed UST-related website content after the meltdown and removed most references to DAI.

Customers noticed a new clause in the Terms and Conditions denying StableGains liability for withdrawal losses. This new clause would have required customers to agree not to sue before withdrawing funds, avoiding a class-action lawsuit.


Customers must sign a waiver to receive a refund.

The law firm Erickson Kramer & Osborne has asked StableGains to preserve all internal documents on customer accounts, marketing, and TerraUSD communications. The firm has not yet filed a lawsuit.


Thousands of StableGains customers lost an estimated $42 million.

Celsius Network customers also affected

Celsius Network also used Terra's Anchor Protocol. Celsius users lost money in the crypto market crash and the UST meltdown; many held CEL and LUNA as yield-bearing deposits.

CEO Alex Mashinsky accused "unknown malefactors" of targeting Celsius Network without evidence. Celsius has not publicly investigated this claim as of this article's publication.

CEL had been falling even before UST de-pegged: it reached $8.01 on June 2, 2021, and closed at $0.82 on May 19.

When some Celsius Network users threatened to leave over token losses, Mashinsky replied, "Leave if you don't think I'm sincere and working harder than you, seven days a week."

Celsius Network itself withdrew $500 million from Anchor Protocol, but smaller holders had trouble getting their funds out.

Read original article here

Vitalik

4 years ago

An approximate introduction to how zk-SNARKs are possible (part 1)

Perhaps the most powerful cryptographic technology to come out of the last decade is general-purpose succinct zero knowledge proofs, usually called zk-SNARKs ("zero knowledge succinct arguments of knowledge"). A zk-SNARK allows you to generate a proof that some computation has some particular output, in such a way that the proof can be verified extremely quickly even if the underlying computation takes a very long time to run. The "ZK" part adds an additional feature: the proof can keep some of the inputs to the computation hidden.

You can make a proof for the statement "I know a secret number such that if you take the word 'cow', add the number to the end, and SHA256 hash it 100 million times, the output starts with 0x57d00485aa". The verifier can verify the proof far more quickly than it would take for them to run 100 million hashes themselves, and the proof would also not reveal what the secret number is.

In the context of blockchains, this has two very powerful applications:

  1. Scalability: if a block takes a long time to verify, one person can verify it and generate a proof, and everyone else can just quickly verify the proof instead
  2. Privacy: you can prove that you have the right to transfer some asset (you received it, and you didn't already transfer it) without revealing the link to which asset you received. This ensures security without unduly leaking information about who is transacting with whom to the public.

But zk-SNARKs are quite complex; indeed, as recently as 2014-17 they were still frequently called "moon math". The good news is that since then, the protocols have become simpler and our understanding of them has become much better. This post will try to explain how zk-SNARKs work, in a way that should be understandable to someone with a medium level of understanding of mathematics.

Why ZK-SNARKs "should" be hard

Let us take the example that we started with: we have a number (we can encode "cow" followed by the secret input as an integer), we take the SHA256 hash of that number, then we do that again another 99,999,999 times, we get the output, and we check what its starting digits are. This is a huge computation.
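Concretely, the claimed computation can be sketched in a few lines of Node.js (an illustrative sketch only; the statementHolds name is ours, and actually running it takes a very long time):

const crypto = require("crypto");

const sha256 = (data) => crypto.createHash("sha256").update(data).digest();

// The statement: appending a secret to "cow", hashing once, then hashing
// another 99,999,999 times yields an output starting with 0x57d00485aa.
// Running this loop is the "huge computation" the verifier wants to avoid.
function statementHolds(secret) {
  let h = sha256(Buffer.from("cow" + secret));
  for (let i = 1; i < 100_000_000; i++) h = sha256(h);
  return h.toString("hex").startsWith("57d00485aa");
}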

A "succinct" proof is one where both the size of the proof and the time required to verify it grow much more slowly than the computation to be verified. If we want a "succinct" proof, we cannot require the verifier to do some work per round of hashing (because then the verification time would be proportional to the computation). Instead, the verifier must somehow check the whole computation without peeking into each individual piece of the computation.

One natural technique is random sampling: how about we just have the verifier peek into the computation in 500 different places, check that those parts are correct, and if all 500 checks pass then assume that the rest of the computation must with high probability be fine, too?

Such a procedure could even be turned into a non-interactive proof using the Fiat-Shamir heuristic: the prover computes a Merkle root of the computation, uses the Merkle root to pseudorandomly choose 500 indices, and provides the 500 corresponding Merkle branches of the data. The key idea is that the prover does not know which branches they will need to reveal until they have already "committed to" the data. If a malicious prover tries to fudge the data after learning which indices are going to be checked, that would change the Merkle root, which would result in a new set of random indices, which would require fudging the data again... trapping the malicious prover in an endless cycle.
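The commit-then-derive trick can be sketched in a few lines (a reconstruction for illustration, not code from the post; the fiatShamirIndices name is ours):

const crypto = require("crypto");

const sha256 = (data) => crypto.createHash("sha256").update(data).digest();

// Derive `count` pseudorandom indices in [0, size) from a Merkle root.
// Because the indices are a function of the committed root, the prover
// cannot choose the data after learning which spots will be checked.
function fiatShamirIndices(merkleRoot, count, size) {
  const indices = [];
  for (let i = 0; i < count; i++) {
    const counter = Buffer.alloc(4);
    counter.writeUInt32BE(i);
    const digest = sha256(Buffer.concat([merkleRoot, counter]));
    indices.push(digest.readUInt32BE(0) % size);
  }
  return indices;
}

// Example: committing to a (toy) root fixes the 500 spot-check positions.
const root = sha256(Buffer.from("merkle root of the computation trace"));
console.log(fiatShamirIndices(root, 500, 100_000_000));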

But unfortunately there is a fatal flaw in naively applying random sampling to spot-check a computation in this way: computation is inherently fragile. If a malicious prover flips one bit somewhere in the middle of a computation, they can make it give a completely different result, and a random sampling verifier would almost never find out.


It only takes one deliberately inserted error, that a random check would almost never catch, to make a computation give a completely incorrect result.

If tasked with the problem of coming up with a zk-SNARK protocol, many people would make their way to this point and then get stuck and give up. How can a verifier possibly check every single piece of the computation, without looking at each piece of the computation individually? There is a clever solution.

see part 2

Ashraful Islam

4 years ago

Clean API Call With React Hooks

Photo by Juanjo Jaramillo on Unsplash

Calling APIs is the most common task in any modern web application. When talking to an API, we usually have to do a lot of repetitive things: getting data from the call, handling the success or error case, and so on.

Across tens or hundreds of API calls, we repeat those tedious tasks every time. We can handle them efficiently by putting a higher level of abstraction over the barebones calls, though in some small applications we may not even care.

The problem comes when we start adding new features on top of existing ones without handling the API calls in an efficient, reusable manner. Then all of that API-call repetition leaves us with a lot of duplicate code across the whole application.

In React, we have different approaches for calling an API; nowadays we mostly use React hooks. With hooks, it's possible to handle API calls in a clean and consistent way throughout the application, regardless of its size. So let's see how we can build a clean, reusable API-calling layer using React hooks for a simple web application.

I’m using a code sandbox for this blog which you can get here.

import "./styles.css";
import React, { useEffect, useState } from "react";
import axios from "axios";

export default function App() {
  const [posts, setPosts] = useState(null);
  const [error, setError] = useState("");
  const [loading, setLoading] = useState(false);

  useEffect(() => {
    handlePosts();
  }, []);

  const handlePosts = async () => {
    setLoading(true);
    try {
      const result = await axios.get(
        "https://jsonplaceholder.typicode.com/posts"
      );
      setPosts(result.data);
    } catch (err) {
      setError(err.message || "Unexpected Error!");
    } finally {
      setLoading(false);
    }
  };

  return (
    <div className="App">
      <div>
        <h1>Posts</h1>
        {loading && <p>Posts are loading!</p>}
        {error && <p>{error}</p>}
        <ul>
          {posts?.map((post) => (
            <li key={post.id}>{post.title}</li>
          ))}
        </ul>
      </div>
    </div>
  );
}

I know the example above isn't the best code, but at least it works and it's valid. I'll improve it later; for now, let's focus on the bare minimum needed to call an API.

Here, we fetch posts data from JSONPlaceholder. These are the most common steps for calling an API: requesting the data and handling the loading, success, and error cases.

If we try to call another API from the same component, what would that look like? Let's see.

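The embedded sandbox fails to load here; based on the surrounding description, the component with a second call added would look roughly like this (a reconstruction, not the author's exact snippet; the handleComments name comes from the text below):

import "./styles.css";
import React, { useEffect, useState } from "react";
import axios from "axios";

export default function App() {
  // Three pieces of state per request, and this is only two requests.
  const [posts, setPosts] = useState(null);
  const [postsError, setPostsError] = useState("");
  const [postsLoading, setPostsLoading] = useState(false);

  const [comments, setComments] = useState(null);
  const [commentsError, setCommentsError] = useState("");
  const [commentsLoading, setCommentsLoading] = useState(false);

  useEffect(() => {
    handlePosts();
    handleComments();
  }, []);

  const handlePosts = async () => {
    setPostsLoading(true);
    try {
      const result = await axios.get(
        "https://jsonplaceholder.typicode.com/posts"
      );
      setPosts(result.data);
    } catch (err) {
      setPostsError(err.message || "Unexpected Error!");
    } finally {
      setPostsLoading(false);
    }
  };

  const handleComments = async () => {
    setCommentsLoading(true);
    try {
      const result = await axios.get(
        "https://jsonplaceholder.typicode.com/comments"
      );
      setComments(result.data);
    } catch (err) {
      setCommentsError(err.message || "Unexpected Error!");
    } finally {
      setCommentsLoading(false);
    }
  };

  return (
    <div className="App">
      <div>
        <h1>Posts</h1>
        {postsLoading && <p>Posts are loading!</p>}
        {postsError && <p>{postsError}</p>}
        <ul>
          {posts?.map((post) => (
            <li key={post.id}>{post.title}</li>
          ))}
        </ul>
      </div>
      <div>
        <h1>Comments</h1>
        {commentsLoading && <p>Comments are loading!</p>}
        {commentsError && <p>{commentsError}</p>}
        <ul>
          {comments?.map((comment) => (
            <li key={comment.id}>{comment.name}</li>
          ))}
        </ul>
      </div>
    </div>
  );
}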

Now it's getting out of hand! For two simple API calls we've written a lot of duplicated code. From a top-level view, the component does nothing but make two GET requests and handle their success and error cases, yet for each request it maintains three pieces of state, a number that will keep growing as we add more calls.

Let’s refactor to make the code more reusable with fewer repetitions.

Step 1: Create a Hook for the Redundant API Request Codes

Most of the repetition so far involves requesting data, handling async behavior, and managing the error, success, and loading states. How about encapsulating those things inside a hook?

The only unique thing handlePosts and handleComments do is call different endpoints; the rest is pretty much the same. So we can create a hook that handles the redundant work for us, and from outside we'll tell it which API to call.

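The embed is broken here as well; judging from the description below and how the hook is consumed in the final App component, useApi would look roughly like this (a reconstruction, not the author's exact code):

import { useState } from "react";

export default function useApi(apiFunc) {
  const [data, setData] = useState(null);
  const [error, setError] = useState("");
  const [loading, setLoading] = useState(false);

  // This mirrors handlePosts/handleComments; only the injected
  // apiFunc differs between API calls.
  const request = async (...args) => {
    setLoading(true);
    try {
      const result = await apiFunc(...args);
      setData(result.data);
    } catch (err) {
      setError(err.message || "Unexpected Error!");
    } finally {
      setLoading(false);
    }
  };

  return { data, error, loading, request };
}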

Here, the request function is identical to what we were doing in handlePosts and handleComments. The only difference is that it calls an async function, apiFunc, which we provide as a parameter to the hook. This apiFunc is the only thing that varies between API calls.

With the hook in place, let's update the old code in our App component like this:

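Another broken embed; reconstructed from the final version shown below, the App component at this stage would look roughly like this, still holding the endpoint functions itself:

import "./styles.css";
import React, { useEffect } from "react";
import axios from "axios";
import useApi from "./hooks/useApi";

const getPosts = () => axios.get("https://jsonplaceholder.typicode.com/posts");
const getComments = () =>
  axios.get("https://jsonplaceholder.typicode.com/comments");

export default function App() {
  const getPostsApi = useApi(getPosts);
  const getCommentsApi = useApi(getComments);

  useEffect(() => {
    getPostsApi.request();
    getCommentsApi.request();
  }, []);

  return (
    <div className="App">
      {/* Post List */}
      <div>
        <h1>Posts</h1>
        {getPostsApi.loading && <p>Posts are loading!</p>}
        {getPostsApi.error && <p>{getPostsApi.error}</p>}
        <ul>
          {getPostsApi.data?.map((post) => (
            <li key={post.id}>{post.title}</li>
          ))}
        </ul>
      </div>
      {/* Comment List */}
      <div>
        <h1>Comments</h1>
        {getCommentsApi.loading && <p>Comments are loading!</p>}
        {getCommentsApi.error && <p>{getCommentsApi.error}</p>}
        <ul>
          {getCommentsApi.data?.map((comment) => (
            <li key={comment.id}>{comment.name}</li>
          ))}
        </ul>
      </div>
    </div>
  );
}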

How does the code look now? Isn't it cleaner, with no repetition or duplicate API-handling logic?

Let's continue from here. We can make the App component even more elegant: right now it knows a lot of details about the underlying library used for the API calls, and it shouldn't. So, here's the next step…

Step 2: One Component Should Take Just One Responsibility

Our App component knows too much about the API calling mechanism. Its responsibility should just request the data. How the data will be requested under the hood, it shouldn’t care about that.

We will extract the API client-related codes from the App component. Also, we will group all the API request-related codes based on the API resource. Now, this is our API client:

import axios from "axios";

const apiClient = axios.create({
  // Later read this URL from an environment variable
  baseURL: "https://jsonplaceholder.typicode.com"
});

export default apiClient;

All API calls for comments resource will be in the following file:

import client from "./client";

const getComments = () => client.get("/comments");

export default {
  getComments
};

All API calls for posts resource are placed in the following file:

import client from "./client";

const getPosts = () => client.get("/posts");

export default {
  getPosts
};

Finally, the App component looks like the following:

import "./styles.css";
import React, { useEffect } from "react";
import commentsApi from "./api/comments";
import postsApi from "./api/posts";
import useApi from "./hooks/useApi";

export default function App() {
  const getPostsApi = useApi(postsApi.getPosts);
  const getCommentsApi = useApi(commentsApi.getComments);

  useEffect(() => {
    getPostsApi.request();
    getCommentsApi.request();
  }, []);

  return (
    <div className="App">
      {/* Post List */}
      <div>
        <h1>Posts</h1>
        {getPostsApi.loading && <p>Posts are loading!</p>}
        {getPostsApi.error && <p>{getPostsApi.error}</p>}
        <ul>
          {getPostsApi.data?.map((post) => (
            <li key={post.id}>{post.title}</li>
          ))}
        </ul>
      </div>
      {/* Comment List */}
      <div>
        <h1>Comments</h1>
        {getCommentsApi.loading && <p>Comments are loading!</p>}
        {getCommentsApi.error && <p>{getCommentsApi.error}</p>}
        <ul>
          {getCommentsApi.data?.map((comment) => (
            <li key={comment.id}>{comment.name}</li>
          ))}
        </ul>
      </div>
    </div>
  );
}

Now the App component doesn't know anything about how the APIs get called. Tomorrow, if we want to swap the API library from axios to fetch or anything else, the App component's code will not be affected; we can just change the code in client.js. This is the beauty of abstraction.
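For instance, a hypothetical fetch-based client.js (a sketch, not from the original post) could be dropped in, as long as it returns the same { data } shape the rest of the code expects:

const baseURL = "https://jsonplaceholder.typicode.com";

// A fetch-based replacement for the axios client. It mimics the
// { data } response shape, so useApi and the resource modules
// keep working unchanged.
const get = async (path) => {
  const response = await fetch(`${baseURL}${path}`);
  if (!response.ok) throw new Error(`Request failed: ${response.status}`);
  return { data: await response.json() };
};

export default { get };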

Apart from abstracting the API calls, the App component isn't the right place to render the list of posts and comments. It's a high-level component; it shouldn't handle such low-level data rendering.

So we should move the data display into separate low-level components. I placed it directly in the App component here purely for demonstration, to avoid distracting from the topic with component composition.

Final Thoughts

React gives us the flexibility to use any third-party library based on the application's needs. Since it doesn't impose a predefined architecture, different teams and developers have adopted different approaches to building React applications. None of them is inherently good or bad; we choose our development practices based on our needs and preferences. The one thing that holds beyond any choice is writing clean and maintainable code.

You might also like

Adrien Book

3 years ago

What is Vitalik Buterin's newest concept, the Soulbound NFT?

Decentralizing Web3's soul

Our tech must reflect our non-transactional connections. Web3 arose from a lack of social links, and it must strengthen those links to achieve widespread adoption. Soulbound NFTs can help.

This NFT creates digital proofs of our social ties. It embodies G. Simmel's idea of identity, in which individuality emerges from social groups, just as social groups evolve from people.

It's multipurpose. First, it gathers our distinctive social features online. Second, it highlights and categorizes social relationships between entities and people, creating a spiderweb of networks.

1. 🌐 Reducing online manipulation: Only socially rich or respectable crypto wallets can participate in projects, ensuring that no one can create several wallets to influence decentralized project governance.

2. 🤝 Improving social links: Some sectors of society lack social context. Racism, sexism, and homophobia do that. Public wallets can help identify and connect distinct social groupings.

3. 👩‍❤️‍💋‍👨 Increasing pluralism: Soulbound tokens can ensure that tightly connected wallets carry less voting power online, increasing pluralism. We can also give extra weight to a minority of diverse voices.

4. 💰Making more informed decisions: Taking out an insurance policy requires a life review. Why not loans? Character isn't limited by income, and many people need a chance.

5. 🎶 Finding a community: Soulbound tokens are accessible to everyone. This means we can find people who are like us but also different. This is probably rare among your friends and family.

These NFTs also carry dangers I don't like: social credit scoring, privacy risks, and lost wallets. We must stay informed and keep talking to innovators.

E. Glen Weyl, Puja Ohlhaver and Vitalik Buterin get all the credit for these ideas, having written the very accessible white paper “Decentralized Society: Finding Web3’s Soul”.

Erik Engheim

3 years ago

You Misunderstand the Russian Nuclear Threat

Many believe Putin is simply sabre-rattling to intimidate us. They see no threat of nuclear war, and believe we can send NATO troops into Ukraine without risking one.

I keep reading that Putin is just using nuclear blackmail and that a strong leader would call the bluff. That, in my opinion, misunderstands the danger of sending NATO into Ukraine.
It assumes that once NATO moves in, Putin faces a single choice: push the red nuclear button or not.
Sure, Putin probably won't go nuclear the moment NATO enters Ukraine. So we're safe, then? Can't we just send NATO in?

No, because history has taught us that wars often escalate far beyond our initial expectations. One domino falls, knocking down another. That's why having clear boundaries is vital: crossing a seemingly harmless line can set off a chain of events that is unstoppable once started.
One example is WWI. The assassin of Archduke Franz Ferdinand could not have known that his actions would kill millions. Austria-Hungary could not have known that invading Serbia to punish it for not handing over the accomplices would start a world war. Every action triggered a counter-action, plunging Europe into a brutal and bloody war. Each leader saw their own actions as limited, not realizing how they kept the dominos falling.

Nobody can predict the future, but it's easy to imagine how NATO intervention could trigger a chain of events leading to a total war. Let me suggest some outcomes.
NATO creates a no-fly zone. In retaliation, Russia bombs NATO airfields. Russia may see this as a limited counter-move that shouldn't cause further NATO escalation; to them, it's a reasonable response to force NATO out of Ukraine. Nobody has yet thought of using a nuke.
How will NATO act? If Polish airfields are bombed, will NATO sit still? Is this an Article 5 event? If so, what should be done?

Any of it could happen. Maybe NATO sends troops into Ukraine to punish Russia. Maybe NATO bombs Russian airfields.

Putin's response: is bombing Russian airfields an invasion or merely an attack? Remember that Russia has always framed its nuclear weapons as defensive, not offensive. But let's not panic; let's assume Russia doesn't go nuclear.

Maybe Russia retaliates by attacking NATO military bases with planes. Maybe they use ships to attack military targets. How does NATO respond? Will they fight Russia in Ukraine or escalate? Will they invade Russia or attack more military installations there?
See the pattern? As each nation responds, small limited military operations grow in scope.

So far, the Russian military has shown that it begins with less brutal methods and turns to more brutal ones as losses and failures mount. It was the same in Syria: Assad used chemical weapons and attacked hospitals, schools, and residential areas.
A NATO invasion of Ukraine would cost Russia dearly. Do you think they would say, "Oh, this isn't looking so good, better pull out and end this war"? No way. Desperate, they would resort to more brutal tactics, and a desperate Russia has a huge arsenal of ugly weapons: nerve agents, chemical weapons, and other nasty stuff.

What happens if Russia uses chemical weapons? What if Russian nerve agents kill NATO soldiers horribly? Western calls for retaliation will grow. Will we invade Russia? Will we bomb them?

We are angry and determined to punish war criminal Putin, so NATO tanks may be heading to Moscow. We want vengeance for his chemical attacks and the bombing of our cities.
Do you think the distance between that red nuclear button and Putin's finger will be that great once NATO tanks are on their way to Moscow?

We might still avoid a full nuclear apocalypse: Putin might use tactical nuclear weapons against a NATO invasion force, or even against Western cities. These are not as destructive as ICBMs, and Putin may think we won't respond to tactical nukes with a full nuclear counterattack. Why would we risk a nuclear holocaust by launching ICBMs at Russia?

Maybe. My point is that at every stage of the escalation, one party may underestimate the other's response. This war is spiraling out of control and the chances of a nuclear exchange are increasing. Nobody really wants it.

Fear, anger, and resentment cause it. If Putin and his inner circle decide their time is up, they may no longer care about the rest of the world. We saw it with Hitler: seeing the end of his empire, he ordered the destruction of Germany. If he couldn't win, nobody should. He wanted to destroy everything, including Paris.

In other words, the danger isn't what happens immediately after NATO intervenes. The danger is the potential chain reaction. Gambling offers a psychological analogue: it's best to walk away while your losses are still small. We humans will take small risks for big rewards, but to avoid losses we are willing to take very high risks. Daniel Kahneman describes this behavior in his book Thinking, Fast and Slow.

And so bettors who have lost a lot begin taking bigger risks to win it back, and we get a snowball effect. NATO involvement in the Ukraine conflict is akin to entering a casino and placing a bet: as we start losing to Russian retaliation, we'll take ever bigger risks. That's the psychology of the game.

It's very hard to stop. Politicians and citizens in both Russia and the West will keep raising the stakes until we risk the end of human civilization.

You can avoid spiraling into ever larger bets in the casino by drawing a hard line and declaring, "I will not enter that casino." That's what we're doing now: we supply Ukraine, we send money and intelligence, but we don't cross that crucial line.

It's difficult to watch what happened in Bucha without demanding NATO involvement. What should we do? Of course, I'm not in charge; I'm a writer. My hope is that people will think about the consequences of the actions we demand, thinking ahead not just one step but multiple dominos.

More and more, we are driven by our emotions. We cannot act solely on emotion in matters of life and death. If we make the wrong choice, more people will die.

Read the original post here.

Scrum Ventures

3 years ago

Trends from Y Combinator's Winter 2022 Demo Day

Y Combinator's Winter 2022 Demo Day continues the trend of more startups participating in accelerator demo days. Our team evaluated almost 400 projects in the batch.

After the Winter 2021 Demo Day, we noticed a rush that pushed shorter funding rounds, inflated valuations, and larger batches.

Despite the batch size, behavior at this event showed a return to normalcy. Our observations suggest that investors are evaluating and funding businesses more carefully. Unlike in previous years, more YC startups provided investors with data rooms and thorough pitch decks, in addition to valuation data, before Demo Day.

Demo Day pitches were virtual and fast-paced, limiting unplanned meetings, so investors had more time and information to do their due diligence before meeting founders. Our team had more time to study diverse areas and engage with interesting entrepreneurs and founders.

This was one of the most regionally diverse YC cohorts to date. This year's Winter Demo Day startups showed some interesting trends.

Trends and Industries to Watch Before Demo Day

Demo day events at any accelerator show how investment competition is influencing startups. As startups swiftly become scale-ups and big success stories in fintech, e-commerce, healthcare, and other competitive industries, entrepreneurs and early-stage investors feel pressure to scale quickly and turn a notion into actual innovation.

Too much eagerness can lead founders to focus on market growth and team experience instead of solid concepts, technical expertise, and market validation. Last year, YC Winter Demo Day funding cycles ended too quickly and valuations were unrealistically high.

Scrum Ventures observed a longer funding cycle this year compared to last year's Demo Day. While that seems promising, many factors could be contributing to the change, including:

  • Changing market patterns and a worsening economy.

  • The industries investors are currently focusing on.

  • Differences between individual event batches and the particular businesses and entrepreneurs taking part.

The Winter 2022 Batch's Trends

Each year, we also wish to examine trends among early-stage firms and YC event participants. More international startups than ever were anticipated to present at Demo Day.

Less than 50% of Demo Day startups were from the U.S. In the S21 batch, firms from outside the US were mostly in Latin America or Europe; this year's batch, however, saw a large surge in startups based in Asia and Africa.

YC Startup Directory

163 out of 399 startups were B2B software and services companies. Financial, healthcare, and consumer startups were common.

Our team can't attend every pitch or speak with every startup's founders or team members, so we focused on a subset of the batch.

Our Opinions Following Conversations with 87 Startups at Demo Day

In the lead-up to Demo Day, we spoke with 87 of the 125 startups going. Compared to B2C enterprises, B2B startups had higher average valuations. A few outliers with high valuations pushed B2B and B2C means above the YC-wide mean and median.

Many of these startups develop business and technology solutions we've previously covered. We've seen API, EdTech, creative-platform, and cybersecurity startups remain strong and grow each year.

While these persistent trends shaped the startups Scrum Ventures looked at and the founders we met on Demo Day, newer trends required more research and preparation. Let's examine the cleantech, Web3, and health and wellness startups.

Green hardware and software

Cleantech companies require varying amounts of funding depending on whether they build hardware or software. Although the same overarching trend is fueling growth across the category, each subgroup demands its own approach to research and to identifying promising investments.

Many cleantech startups we spoke to during the YC event are focused on helping industrial operations decrease or recycle carbon emissions.

  • Carbon Crusher: Creating carbon negative roads

  • Phase Biolabs: Turning carbon emissions into carbon negative products and carbon neutral e-fuels

  • Seabound: Capturing carbon dioxide emissions from ships

  • Fleetzero: Creating electric cargo ships

  • Impossible Mining: Sustainable seabed mining

  • Beyond Aero: Creating zero-emission private aircraft

  • Verdn: Helping businesses automatically embed environmental pledges into product and service offerings and boost customer engagement

  • AeonCharge: Allowing electric vehicle (EV) drivers to more easily locate and pay for EV charging stations

  • Phoenix Hydrogen: Offering a hydrogen marketplace and a connected hydrogen hub platform to connect supply and demand for hydrogen fuel and simplify hub planning and partner program expansion

  • Aklimate: Allowing businesses to measure and reduce their supply chain’s environmental impact

  • Pina Earth: Certifying and tracking the progress of businesses’ forestry projects

  • AirMyne: Developing machines that can reverse emissions by removing carbon dioxide from the air

  • Unravel Carbon: Software for enterprises to track and reduce their carbon emissions

Web3: NFTs, the metaverse, and cryptocurrency

Web3 technologies handle a wide range of business issues. This category includes companies employing blockchain technology to disrupt entertainment, finance, cybersecurity, and software development.

Many of these startups overlap with YC's FinTech trend. Despite this, B2C and B2B enterprises were evenly represented in Web3. We examined:

  • Stablegains: Offering consistent interest on cash balance from the decentralized finance (DeFi) market

  • LiquiFi: Simplifying token management with automated vesting contracts, tax reporting, and scheduling. For companies, investors, and finance & accounting

  • NFTScoring: An NFT trading platform

  • CypherD Wallet: A multichain wallet for crypto and NFTs with a non-custodial crypto debit card that instantly converts coins to USD

  • Remi Labs: Allowing businesses to more easily create NFT collections that serve as access to products, memberships, events, and more

  • Cashmere: A crypto wallet for Web3 startups to collaboratively manage funds

  • Chaingrep: An API that makes blockchain data human-readable and tokens searchable

  • Courtyard: A platform for securely storing physical assets and creating 3D representations as NFTs

  • Arda: “Banking as a Service for DeFi,” an API that FinTech companies can use to embed DeFi products into their platforms

  • earnJARVIS: A premium cryptocurrency management platform, allowing users to create long-term portfolios

  • Mysterious: Creating community-specific experiences for Web3 Discords

  • Winter: An embeddable widget that allows businesses to sell NFTs to users purchasing with a credit card or bank transaction

  • SimpleHash: An API for NFT data that provides compatibility across blockchains, standardized metadata, accurate transaction info, and simple integration

  • Lifecast: Tools that address motion sickness issues for 3D VR video

  • Gym Class: Virtual reality (VR) multiplayer basketball video game

  • WorldQL: An asset API that allows NFT creators to specify multiple in-game interpretations of their assets, increasing their value

  • Bonsai Desk: A software development kit (SDK) for 3D analytics

  • Campfire: Supporting virtual social experiences for remote teams

  • Unai: A virtual headset and Visual World experience

  • Vimmerse: Allowing creators to more easily create immersive 3D experiences

Fitness and health

Scrum Ventures encountered fewer health and wellness founders than Web3 and cleantech ones, but the challenges these companies solve are still diverse. Several are part of a push toward personalization in healthcare, an area of biotech poised for growth for companies with strong portfolios and experienced leadership.

Here are several startups we considered:

  • Syrona Health: Personalized healthcare for women in the workplace

  • Anja Health: Personalized umbilical cord blood banking and stem cell preservation

  • Alfie: A weight loss program focused on men’s health that coordinates medical care, coaching, and “community-based competition” to help users lose an average of 15% body weight

  • Ankr Health: An artificial intelligence (AI)-enabled telehealth platform that provides personalized side effect education for cancer patients and data collection for their care teams

  • Koko: A personalized sleep program to improve at-home sleep analysis and training

  • Condition-specific telehealth platforms and programs:

  • Reviving Mind: Chronic care management covered by insurance and supporting holistic, community-oriented health care

  • Equipt Health: At-home delivery of prescription medical equipment to help manage chronic conditions like obstructive sleep apnea

  • LunaJoy: Holistic women’s healthcare management for mental health therapy, counseling, and medication

12 Startups from YC's Winter 2022 Demo Day to Watch

Bobidi: 10x faster AI model improvement

Artificial intelligence (AI) models have become a significant tool for firms to improve how well and rapidly they process data. Bobidi helps AI-reliant firms evaluate their models, boosting data insights in less time and reducing data analysis expenditures. The business has created a gamified community that offers a bug bounty for AI, incentivizing community members to test and find weaknesses in clients' AI models.

Magna: DeFi investment management and token vesting

Magna delivers rapid, secure token vesting so customers can turn DeFi investments into primitives. Billed as a "Carta for Web3," it lets enterprises effortlessly distribute tokens to staff or investors. The Magna team hopes to let corporations use locked tokens as collateral for loans, to facilitate secondary liquidity so investors can sell shares on a public exchange, and to power additional DeFi applications.

Perl Street: Funding for infrastructure

This fintech firm intends to help hardware entrepreneurs get financing by "democratizing structured finance, unleashing billions for sustainable infrastructure and next-generation hardware solutions." Its network has already helped hardware entrepreneurs secure more than $140 million in financing, supporting companies working on energy storage devices, EVs, and power infrastructure.

CypherD: Multichain cryptocurrency wallet

CypherD seeks to provide a multichain crypto wallet so general customers can explore Web3 products without knowledge hurdles. The startup's beta app lets consumers access crypto from EVM blockchains. The founders have crypto, financial, and startup experience.

Unravel Carbon: Enterprise carbon tracking and offsetting

Unravel Carbon's AI-powered decarbonization software tracks companies' carbon emissions. The Singapore-based startup focuses on Asia. The software can use any company's financial data to trace its supply chain and calculate carbon figures, which are used to make regulatory disclosures and suggest carbon offsets.

LunaJoy: Precision mental health for women

LunaJoy helps women obtain mental health support throughout life. The platform uses data science to create a tailored experience, allowing women to access psychotherapy, medication management, genetic testing, and health coaching.

Posh: Automated EV battery recycling

Posh attempts to solve one of the EV industry's largest logistical difficulties. Millions of EV batteries will need to be decommissioned in the next decade, and their precious metals and residual capacity will go unused for some time. Posh offers automated, scalable lithium battery disassembly, making EV battery recycling more viable.

Unai: VR headset with 5x higher resolution

Unai stands apart from other metaverse companies. Its VR headset has five times the resolution of existing options and emphasizes human expression and interaction in a remote world. Founder Maxim Perumal's method of latency reduction powers current VR headsets.

Palitronica: Physical infrastructure cybersecurity

Palitronica blends cutting-edge hardware and software to produce networked electronic systems that support crucial physical and supply chain infrastructure. The startup's objective is to build solutions that defend national security and key infrastructure from cybersecurity threats.

Reality Defender: Deepfake detection

Reality Defender alerts firms to bogus users and altered audio, video, and image files. Reality Defender's API and web app score material in real time to prevent fraud, improve content moderation, and detect deception.

Micro Meat: Infrastructure for the manufacture of cell-cultured meat

MicroMeat promotes sustainable meat production. The company has created technologies to scale up the growth of meat muscle tissue from animal cells in bioreactors. Its goal is to scale up cultured meat manufacturing so cultivated meat products can be brought to market feasibly and swiftly, helping meet worldwide meat demand.

Fleetzero: Electric cargo ships

This startup's battery technology aims to make cargo ships more sustainable and profitable. Fleetzero's electric cargo ships have five times larger profit margins than fossil-fuel ships, and Fleetzero's founder has experience in marine engineering, ship operations, and enterprise sales.