
[OC] Which front offices and agents are the 3 major newsbreakers connected to? I went through 6000+ tweets to find out!

If this sounds somewhat familiar, that's because I did a 2019-2020 version and posted it back in March.
In terms of changes from that post:
TL;DR
Tracked tweets of Woj, Shams and Haynes from 2018-2020 to see whether any of them report on a certain team or a certain agent's players more than their counterparts. Here is the main graph concerning a reporter's percentage of tweets per team separated into three periods (2019 season, 2020 offseason, 2020 season). Here is a separate graph with the Lakers and Warriors, because Haynes's percentages would skew the first graph.

During times like the NBA trade deadline or the lifting of the NBA free-agency moratorium, it’s not uncommon to see Twitter replies to (or Reddit comments about) star reporters reference their performance relative to others.
Woj is the preeminent scoop hound, but he is also notorious for writing hit pieces on LeBron (it’s been widely rumoured that this is because Woj has never been able to place a reliable source in LeBron’s camp). On the other end of the spectrum, it has been revealed that in exchange for exclusive intel on league memos and Pistons dealings, Woj wrote puff pieces on then-GM Joe Dumars (see above Kevin Draper link). Last summer, Woj was accused of being a Clippers shill on this very discussion board for noticeably driving the Kawhi Leonard free agency conversation towards the team.
This is the reason I undertook this project: to see whether some reporters have more sources in certain teams (and certain agencies) than other reporters.
First I’ll explain the methodology, then present the data with some initial comments.

Methodology

To keep this manageable, I limited myself to tracking the 3 major national reporters: Shams Charania of The Athletic, Chris Haynes of Yahoo Sports, and the aforementioned Adrian Wojnarowski of ESPN.
The time period I initially tracked was January 1, 2020 to the end of the regular season in March, but after finding a Twitter scraping tool on GitHub called Twint, I was able to easily retrieve all tweets back to September 27, 2018. However, a month ago Twitter closed its old API endpoints and Twint ceased to work. I switched to vicinitas.io, but loading the data became more time-consuming. Therefore, the tweets run up to October 15, 2020.
I gathered the information by manually parsing each reporter’s text tweets (no retweets):
Now, I didn’t take every single text tweet:
Next, I had to assign possible teams to each tweet:
With all the methodology out of the way, here’s the data! (Here’s a link to a full Google Sheet)

Teams

Here's a graph of the number of tweets per team per period, with the colours denoting the reporter.
At a quick glance, here are the teams that saw a significant period-over-period increase in number of tweets:
And here are the teams that saw period-over-period decreases:
The problem with using raw tweet counts is that the totals aren't close between Haynes and Woj or Shams. Here's a graph showing the total number of tweets in each period for all three reporters. Haynes's most-reported period doesn't even stack up to the least-reported period of Woj or Shams.
Instead, let's look at percentage of tweets per team per period.
Now, you'll notice that there are two teams missing from the above graph: the Golden State Warriors and the Los Angeles Lakers. Here are the graphs for those two teams. As you can see, they would skew the previous graph far too much. During the 2019 NBA season, 27% of Chris Haynes's qualifying tweets could possibly be linked to the Warriors, and 14% could possibly be linked to the Lakers.
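For readers curious how percentages like these fall out of the tagged data, here is a minimal sketch with hypothetical labels (the real dataset was tagged manually, as described under Methodology):

```python
from collections import Counter

# Hypothetical sample of (reporter, period, team) labels assigned to tweets,
# standing in for the manually tagged dataset described above.
tagged_tweets = [
    ("Haynes", "2019 season", "Warriors"),
    ("Haynes", "2019 season", "Warriors"),
    ("Haynes", "2019 season", "Lakers"),
    ("Haynes", "2019 season", "Bucks"),
    ("Shams", "2019 season", "Lakers"),
    ("Shams", "2019 season", "Knicks"),
]

def team_percentages(tweets, reporter, period):
    """Percentage of a reporter's qualifying tweets linked to each team in a period."""
    counts = Counter(team for r, p, team in tweets if r == reporter and p == period)
    total = sum(counts.values())
    return {team: 100 * n / total for team, n in counts.items()}

print(team_percentages(tagged_tweets, "Haynes", "2019 season"))
# {'Warriors': 50.0, 'Lakers': 25.0, 'Bucks': 25.0}
```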

Agents

Here's the top 10 agents in terms of number of potential tweets concerning their clients.
| Agent | Haynes | Shams | Woj | Total |
| --- | ---: | ---: | ---: | ---: |
| Rich Paul | 15 | 28 | 24 | 67 |
| Mark Bartelstein | 4 | 16 | 30 | 50 |
| Jeff Schwartz | 3 | 10 | 25 | 38 |
| Bill Duffy | 2 | 13 | 14 | 29 |
| Leon Rose | 1 | 12 | 15 | 28 |
| Aaron Mintz | 2 | 9 | 15 | 26 |
| Juan Perez | 5 | 10 | 8 | 23 |
| Aaron Goodwin | 11 | 8 | 1 | 20 |
| Steven Heumann | 1 | 6 | 12 | 19 |
| Sam Permut | 1 | 13 | 5 | 19 |
Woj has the most tweets directly connected to agents by far. It wasn't uncommon to see "Player X signs deal with Team Y, Agent Z of Agency F tells ESPN." The agents that go to Woj (and some of their top clients):
One thing I found very intriguing: 15 of 16 tweets concerning an Aaron Turner client were reported by Shams. Turner is the head of Verus Basketball, whose clients include Terry Rozier, Victor Oladipo and Kevin Knox. Shams also reported more than 50% of news relating to clients of Sam Permut of Roc Nation. Permut is the current agent of Kyrie Irving, after Irving fired Jeff Wechsler near the beginning of the 2019 offseason. Permut also reps the Morris brothers and Trey Burke.
As for Chris Haynes, he doesn't really do much agent news (at least not at the level of Woj and Shams). However, he reported more than 50% of news relating to clients of Aaron Goodwin of Goodwin Sports Management, who reps Damian Lillard and DeMar DeRozan.
Here are the top 10 free agents from Forbes, along with their agent and who I predict will be the first/only one to break the news.
| Player | Agent | Most Likely Reporter |
| --- | --- | --- |
| Anthony Davis | Rich Paul | Too close to call, leaning Shams |
| Brandon Ingram | Jeff Schwartz | Woj |
| DeMar DeRozan | Aaron Goodwin | Haynes |
| Fred VanVleet | Brian Jungreis | Limited data |
| Andre Drummond | Jeff Schwartz | Woj |
| Montrezl Harrell | Rich Paul | Too close to call, leaning Shams |
| Gordon Hayward | Mark Bartelstein | Woj |
| Danilo Gallinari | Michael Tellem | Woj |
| Bogdan Bogdanovic | Alexander Raskovic, Jason Ranne | Limited data, but part of Wasserman, whose players are predominantly reported on by Woj |
| Davis Bertans | Arturs Kalnitis | Limited data |
Thanks for reading! As always with this type of work, human error is not completely eliminated. If you think a tweet was mistakenly removed, feel free to drop me a line and I’ll try to explain my thought process on that specific tweet! Hope y’all enjoyed the research!
submitted by cilantro_samosa to r/nba

Dragonchain Great Reddit Scaling Bake-Off Public Proposal

Dragonchain Public Proposal TL;DR:

Dragonchain has demonstrated twice Reddit’s total daily volume (votes, comments, and posts, per Reddit’s 2019 Year in Review) in a 24-hour demo on an operational network. Every single transaction on Dragonchain is decentralized immediately through 5 levels of Dragon Net, and then secured with combined proof on Bitcoin, Ethereum, Ethereum Classic, and Binance Chain, via Interchain. At the time, in January 2020, the entire cost of the demo was approximately $25K on a single system (transaction fees locked at $0.0001/txn). With current fees (lowest fee $0.0000025/txn), the same demo would cost as little as $625.
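The quoted cost figures follow from simple per-transaction arithmetic:

```python
# Back-of-envelope check of the demo cost figures quoted above.
TXNS = 250_000_000        # transactions executed in the 24-hour demo
FEE_2020 = 0.0001         # $/txn, fee locked during the January 2020 demo
FEE_NOW = 0.0000025       # $/txn, lowest current fee cited in the proposal

demo_cost = TXNS * FEE_2020    # about $25K, matching the figure above
current_cost = TXNS * FEE_NOW  # about $625 at the current lowest fee
```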
Watch Joe walk through the entire proposal and answer questions on YouTube.
This proposal is also available on the Dragonchain blog.

Hello Reddit and Ethereum community!

I’m Joe Roets, Founder & CEO of Dragonchain. When the team and I first heard about The Great Reddit Scaling Bake-Off we were intrigued. We believe we have the solutions Reddit seeks for its community points system and we have them at scale.
For your consideration, we have submitted our proposal below. The team at Dragonchain and I welcome and look forward to your technical questions, philosophical feedback, and fair criticism, to build a scaling solution for Reddit that will empower its users. Because our architecture is unlike other blockchain platforms out there today, we expect to receive many questions while people try to grasp our project. I will answer all questions here in this thread on Reddit, and I've answered some questions in the stream on YouTube.
We have seen good discussions so far in the competition. We hope that Reddit’s scaling solution will emerge from The Great Reddit Scaling Bake-Off and that Reddit will have great success with the implementation.

Executive summary

Dragonchain is a robust open source hybrid blockchain platform that has proven to withstand the passing of time since our inception in 2014. We have continued to evolve to harness the scalability of private nodes, yet take full advantage of the security of public decentralized networks, like Ethereum. We have a live, operational, and fully functional Interchain network integrating Bitcoin, Ethereum, Ethereum Classic, and ~700 independent Dragonchain nodes. Every transaction is secured to Ethereum, Bitcoin, and Ethereum Classic. Transactions are immediately usable on chain, and the first decentralization is seen within 20 seconds on Dragon Net. Security increases further to public networks ETH, BTC, and ETC within 10 minutes to 2 hours. Smart contracts can be written in any executable language, offering full freedom to existing developers. We invite any developer to watch the demo, play with our SDK’s, review open source code, and to help us move forward. Dragonchain specializes in scalable loyalty & rewards solutions and has built a decentralized social network on chain, with very affordable transaction costs. This experience can be combined with the insights Reddit and the Ethereum community have gained in the past couple of months to roll out the solution at a rapid pace.

Response and PoC

In The Great Reddit Scaling Bake-Off post, Reddit has asked for a series of demonstrations, requirements, and other considerations. In this section, we will attempt to answer all of these requests.

Live Demo

A live proof of concept showing hundreds of thousands of transactions
On Jan 7, 2020, Dragonchain hosted a 24-hour live demonstration during which a quarter of a billion (250 million+) transactions executed fully on an operational network. Every single transaction on Dragonchain is decentralized immediately through 5 levels of Dragon Net, and then secured with combined proof on Bitcoin, Ethereum, Ethereum Classic, and Binance Chain, via Interchain. This means that every single transaction is secured by, and traceable to these networks. An attack on this system would require a simultaneous attack on all of the Interchained networks.
24 hours in 4 minutes (YouTube)
The demonstration was of a single business system, and any user is able to scale this further, by running multiple systems simultaneously. Our goals for the event were to demonstrate a consistent capacity greater than that of Visa over an extended time period.
Tooling to reproduce our demo is available here:
https://github.com/dragonchain/spirit-bomb

Source Code

Source code (for on- and off-chain components, as well as tooling used for the PoC). The source code does not have to be shared publicly, but if Reddit decides to use a particular solution it will need to be shared with Reddit at some point.

Scaling

How it works & scales

Architectural Scaling

Dragonchain’s architecture attacks the scalability issue from multiple angles. Dragonchain is a hybrid blockchain platform, wherein every transaction is protected on a business node to the requirements of that business or purpose. A business node may be held completely private or may be exposed or replicated to any level of exposure desired.
Every node has its own blockchain and is independently scalable. Dragonchain established Context Based Verification as its consensus model. Every transaction is immediately usable on a trust basis, and in time is provable to an increasing level of decentralized consensus. A transaction will have a level of decentralization to independently owned and deployed Dragonchain nodes (~700 nodes) within seconds, and full decentralization to BTC and ETH within minutes or hours. Level 5 nodes (Interchain nodes) function to secure all transactions to public or otherwise external chains such as Bitcoin and Ethereum. These nodes scale the system by aggregating multiple blocks into a single Interchain transaction on a cadence. This timing is configurable based upon average fees for each respective chain. For detailed information about Dragonchain’s architecture, and Context Based Verification, please refer to the Dragonchain Architecture Document.

Economic Scaling

An interesting feature of Dragonchain’s network consensus is its economics and scarcity model. Since Dragon Net nodes (L2-L4) are independent staking nodes, deployment to cloud platforms would allow any of these nodes to scale to take on a large percentage of the verification work. This is great for scalability, but not good for the economy, because there is no scarcity, and pricing would develop a downward spiral and result in fewer verification nodes. For this reason, Dragonchain uses TIME as scarcity.
TIME is calculated as the number of Dragons held, multiplied by the number of days held. TIME influences the user’s access to features within the Dragonchain ecosystem. It takes into account both the Dragon balance and length of time each Dragon is held. TIME is staked by users against every verification node and dictates how much of the transaction fees are awarded to each participating node for every block.
TIME also dictates the transaction fee itself for the business node. TIME is staked against a business node to set a deterministic transaction fee level (see transaction fee table below in Cost section). This is very interesting in a discussion about scaling because it guarantees independence for business implementation. No matter how much traffic appears on the entire network, a business is guaranteed to not see an increased transaction fee rate.
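As a toy illustration of the scarcity model described above (all numbers hypothetical), TIME accrues as Dragons held multiplied by days held, and per-block rewards on a node split in proportion to staked TIME:

```python
# Illustrative sketch of the TIME scarcity model; all figures are hypothetical.

def time_score(dragons_held: float, days_held: int) -> float:
    """TIME accrued by holding `dragons_held` Dragons for `days_held` days."""
    return dragons_held * days_held

# Two hypothetical stakers on the same verification node:
alice = time_score(10_000, 365)   # smaller balance, long-term holder
bob = time_score(100_000, 10)     # larger balance, recent holder

# Per-block transaction-fee rewards are split in proportion to staked TIME.
total = alice + bob
alice_share = alice / total       # long holding outweighs the larger balance here
```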

Scaled Deployment

Dragonchain uses Docker and Kubernetes to allow the use of best-practice traditional system scaling. Dragonchain offers managed nodes with an easy-to-use web-based console interface. The user may also deploy a Dragonchain node within their own datacenter or favorite cloud platform. Users have deployed Dragonchain nodes on-premises and on Amazon AWS, Google Cloud, MS Azure, and other hosting platforms around the world. Any executable code, anything you can write, can be written into a smart contract. This flexibility is what allows us to say that developers with no blockchain experience can use any code language to access the benefits of blockchain. Customers have used NodeJS, Python, Java, and even BASH shell script to write smart contracts on Dragonchain.
With Docker containers, we achieve better separation of concerns, faster deployment, higher reliability, and lower response times.
We chose Kubernetes for its self-healing features, ability to run multiple services on one server, and its large and thriving development community. It is resilient, scalable, and automated. OpenFaaS allows us to package smart contracts as Docker images for easy deployment.
Contract deployment time is now bounded only by the size of the Docker image being deployed but remains fast even for reasonably large images. We also take advantage of Docker’s flexibility and its ability to support any language that can run on x86 architecture. Any image, public or private, can be run as a smart contract using Dragonchain.

Flexibility in Scaling

Dragonchain’s architecture considers interoperability and integration as key features. From inception, we had a goal to increase adoption via integration with real business use cases and traditional systems.
We envision the ability for Reddit, in the future, to be able to integrate alternate content storage platforms or other financial services along with the token.
  • LBRY - to allow users to deploy content natively to LBRY
  • MakerDAO - to allow users to lend small amounts backed by their Reddit community points
  • STORJ/SIA - to allow decentralized on-chain storage of portions of content
These integrations, or any others, are relatively easy to implement on Dragonchain via an Interchain implementation.

Cost

Cost estimates (on-chain and off-chain)
For the purpose of this proposal, we assume that all transactions are on chain (posts, replies, and votes).
On the Dragonchain network, transaction costs are deterministic/predictable. By staking TIME on the business node (as described above) Reddit can reduce transaction costs to as low as $0.0000025 per transaction.
Dragonchain Fees Table

Getting Started

How to run it
Building on Dragonchain is simple and requires no blockchain experience. Spin up a business node (L1) in our managed environment (AWS), run it in your own cloud environment, or on-prem in your own datacenter. Clear documentation will walk you through the steps of spinning up your first Dragonchain Level 1 Business node.
Getting started is easy...
  1. Download Dragonchain’s dctl
  2. Input three commands into a terminal
  3. Build an image
  4. Run it
More information can be found in our Get started documents.

Architecture
Dragonchain is an open source hybrid platform. Through Dragon Net, each chain combines the power of a public blockchain (like Ethereum) with the privacy of a private blockchain.
Dragonchain organizes its network into five separate levels. A Level 1, or business node, is a totally private blockchain only accessible through the use of public/private keypairs. All business logic, including smart contracts, can be executed on this node directly and added to the chain.
After creating a block, the Level 1 business node broadcasts a version stripped of sensitive private data to Dragon Net. Three Level 2 Validating nodes validate the transaction based on guidelines determined from the business. A Level 3 Diversity node checks that the level 2 nodes are from a diverse array of locations. A Level 4 Notary node, hosted by a KYC partner, then signs the validation record received from the Level 3 node. The transaction hash is ledgered to the Level 5 public chain to take advantage of the hash power of massive public networks.
Dragon Net can be thought of as a “blockchain of blockchains”, where every level is a complete private blockchain. Because an L1 can send to multiple nodes on a single level, proof of existence is distributed among many places in the network. Eventually, proof of existence reaches level 5 and is published on a public network.
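The five-level flow described above can be sketched, purely as an illustration, as a hash pipeline; every name and structure below is hypothetical, not Dragonchain's actual implementation:

```python
import hashlib
import json

def sha256(data: str) -> str:
    """Hex digest of a string; stands in for each level's proof here."""
    return hashlib.sha256(data.encode()).hexdigest()

# L1 business node: block holds private payload; only a stripped version leaves.
block = {"id": 1, "payload": {"user": "alice", "karma": 42}}
stripped = {"id": block["id"], "payload_hash": sha256(json.dumps(block["payload"]))}

# L2: three validating nodes check the stripped block against business rules.
l2_validations = [sha256(f"l2-node-{i}:{stripped['payload_hash']}") for i in range(3)]

# L3: diversity node confirms the L2 validations came from distinct locations.
l3_record = sha256("l3:" + "".join(l2_validations))

# L4: notary node signs the validation record received from L3.
l4_signature = sha256("l4-notary:" + l3_record)

# L5: Interchain node ledgers the final hash to public chains (BTC/ETH/ETC).
interchain_proof = sha256("l5:" + l4_signature)
```

The point of the sketch is the shape of the flow: private data never leaves L1, and each level only ever handles proofs derived from the level below.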

API Documentation

APIs (on chain & off)

SDK Source

Nobody’s Perfect

Known issues or tradeoffs
  • Dragonchain is open source and even though the platform is easy enough for developers to code in any language they are comfortable with, we do not have so large a developer community as Ethereum. We would like to see the Ethereum developer community (and any other communities) become familiar with our SDK’s, our solutions, and our platform, to unlock the full potential of our Ethereum Interchain. Long ago we decided to prioritize both Bitcoin and Ethereum Interchains. We envision an ecosystem that encompasses different projects to give developers the ability to take full advantage of all the opportunities blockchain offers to create decentralized solutions not only for Reddit but for all of our current platforms and systems. We believe that together we will take the adoption of blockchain further. We currently have additional Interchain with Ethereum Classic. We look forward to Interchain with other blockchains in the future. We invite all blockchains projects who believe in decentralization and security to Interchain with Dragonchain.
  • We only have ~700 nodes, compared to Ethereum’s 8,000 and Bitcoin’s 10,000. However, through Interchain we harness those 18,000 nodes to scale to extremely high levels of security. See Dragonchain metrics.
  • Some may consider the centralization of Dragonchain’s business nodes as an issue at first glance, however, the model is by design to protect business data. We do not consider this a drawback as these nodes can make any, none, or all data public. Depending upon the implementation, every subreddit could have control of its own business node, for potential business and enterprise offerings, bringing new alternative revenue streams to Reddit.

Costs and resources

Summary of cost & resource information for both on-chain & off-chain components used in the PoC, as well as cost & resource estimates for further scaling. If your PoC is not on mainnet, make note of any mainnet caveats (such as congestion issues).
Every transaction on the PoC system had a transaction fee of $0.0001 (one-hundredth of a cent USD). At 256MM transactions, the demo cost $25,600. With current operational fees, the same demonstration would cost $640 USD.
For the demonstration, to achieve throughput to mimic a worldwide payments network, we modeled several clients in AWS and 4-5 business nodes to handle the traffic. The business nodes were tuned to handle higher throughput by adjusting memory and machine footprint on AWS. This flexibility is valuable to implementing a system such as envisioned by Reddit. Given that Reddit’s daily traffic (posts, replies, and votes) is less than half that of our demo, we would expect that the entire Reddit system could be handled on 2-5 business nodes using right-sized containers on AWS or similar environments.
Verification was accomplished on the operational Dragon Net network with over 700 independently owned verification nodes running around the world at no cost to the business other than paid transaction fees.

Requirements

Scaling

This PoC should scale to the numbers below with minimal costs (both on & off-chain). There should also be a clear path to supporting hundreds of millions of users.
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
During Dragonchain’s 24 hour demo, the above required numbers were reached within the first few minutes.
Reddit’s total activity is 9000% more than Ethereum’s total transaction level. Even if you do not include votes, it is still 700% more than Ethereum’s current volume. Dragonchain has demonstrated that it can handle 250 million transactions a day, and its architecture allows multiple systems to work at that level simultaneously. In our PoC, we demonstrated double the full capacity of Reddit, and every transaction was proven all the way to Bitcoin and Ethereum.
Reddit Scaling on Ethereum

Decentralization

Solutions should not depend on any single third-party provider. We prefer solutions that do not depend on specific entities such as Reddit or another provider, and solutions with no single point of control or failure in off-chain components, but recognize there are numerous trade-offs to consider.
Dragonchain’s architecture calls for a hybrid approach. Private business nodes hold the sensitive data while the validation and verification of transactions for the business are decentralized within seconds and secured to public blockchains within 10 minutes to 2 hours. Nodes could potentially be controlled by owners of individual subreddits for more organic decentralization.
  • Billing is currently centralized - there is a path to federation and decentralization of a scaled billing solution.
  • Operational multi-cloud
  • Operational on-premises capabilities
  • Operational deployment to any datacenter
  • Over 700 independent Community Verification Nodes with proof of ownership
  • Operational Interchain (Interoperable to Bitcoin, Ethereum, and Ethereum Classic, open to more)

Usability

Scaling solutions should have a simple end user experience.

Users shouldn't have to maintain any extra state/proofs, regularly monitor activity, keep track of extra keys, or sign anything other than their normal transactions
Dragonchain and its customers have demonstrated extraordinary usability as a feature in many applications, where users do not need to know that the system is backed by a live blockchain. Lyceum is one of these examples, where the progress of academy courses is being tracked, and successful completion of courses is rewarded with certificates on chain. Our @Save_The_Tweet bot is popular on Twitter. When used with one of the following hashtags - #please, #blockchain, #ThankYou, or #eternalize the tweet is saved through Eternal to multiple blockchains. A proof report is available for future reference. Other examples in use are DEN, our decentralized social media platform, and our console, where users can track their node rewards, view their TIME, and operate a business node.
Examples:

Transactions complete in a reasonable amount of time (seconds or minutes, not hours or days)
All transactions are immediately usable on chain by the system. A transaction begins the path to decentralization at the conclusion of a 5-second block when it gets distributed across 5 separate community run nodes. Full decentralization occurs within 10 minutes to 2 hours depending on which interchain (Bitcoin, Ethereum, or Ethereum Classic) the transaction hits first. Within approximately 2 hours, the combined hash power of all interchained blockchains secures the transaction.

Free to use for end users (no gas fees, or fixed/minimal fees that Reddit can pay on their behalf)
With transaction pricing as low as $0.0000025 per transaction, it may be considered reasonable for Reddit to cover transaction fees for users.
All of Reddit's Transactions on Blockchain (month)
Community points can be earned by users and distributed directly to their Reddit account in batch (as per Reddit minting plan), and allow users to withdraw rewards to their Ethereum wallet whenever they wish. Withdrawal fees can be paid by either the user or Reddit. This model has been operating inside the Dragonchain system since 2018, and many security and financial compliance features can be optionally added. We feel that this capability greatly enhances user experience because it is seamless to a regular user without cryptocurrency experience, yet flexible to a tech-savvy user. With regard to currency or token transactions, these would occur on the Reddit network, verified to BTC and ETH, and would incur the $0.0000025 transaction fee. To estimate this fee we use the monthly active Reddit user count (per Statista) with a 60% adoption rate and an estimated 10 transactions per user per month, resulting in an approximate $720 monthly cost across the system. Reddit could feasibly incur all associated internal network charges (mining/minting, transfer, burn) as these are very low and controllable fees.
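The $720 figure can be reproduced with simple arithmetic. Note that the monthly-active-user base below is the value implied by the proposal's own rate, adoption, and per-user assumptions; it is not a number stated in this document:

```python
# Back-of-envelope reconstruction of the ~$720/month fee estimate above.
FEE = 0.0000025          # $/txn, Dragonchain's lowest current fee
ADOPTION = 0.60          # assumed adoption rate among monthly active users
TXNS_PER_USER = 10       # assumed average transactions per user per month
MAU = 48_000_000         # monthly active users IMPLIED by the $720 figure

monthly_cost = MAU * ADOPTION * TXNS_PER_USER * FEE  # about $720/month
```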
Reddit Internal Token Transaction Fees

Reddit Ethereum Token Transaction Fees
When we consider further the Ethereum fees that might be incurred, we have a few choices for a solution.
  1. Offload all Ethereum transaction fees (user withdrawals) to interested users as they wish to withdraw tokens for external use or sale.
  2. Cover Ethereum transaction fees by aggregating them on a timed schedule. Users would request withdrawal (from Reddit or individual subreddits), and they would be transacted on the Ethereum network every hour (or some other schedule).
  3. In a combination of the above, customers could cover aggregated fees.
  4. Integrate with alternate Ethereum roll up solutions or other proposals to aggregate minting and distribution transactions onto Ethereum.

Bonus Points

Users should be able to view their balances & transactions via a blockchain explorer-style interface
From interfaces for users who have no knowledge of blockchain technology to users who are well versed in blockchain terms such as those present in a typical block explorer, a system powered by Dragonchain has flexibility on how to provide balances and transaction data to users. Transactions can be made viewable in an Eternal Proof Report, which displays raw data along with TIME staking information and traceability all the way to Bitcoin, Ethereum, and every other Interchained network. The report shows fields such as transaction ID, timestamp, block ID, multiple verifications, and Interchain proof. See example here.
Node payouts within the Dragonchain console are listed in chronological order and can be further seen in either Dragons or USD. See example here.
In our social media platform, Dragon Den, users can see, in real-time, their NRG and MTR balances. See example here.
A new influencer app powered by Dragonchain, Raiinmaker, breaks down data into a user friendly interface that shows coin portfolio, redeemed rewards, and social scores per campaign. See example here.

Exiting is fast & simple
Withdrawing funds in Dragonchain’s console requires three clicks; withdrawal flows with more enhanced security features are available at Reddit’s discretion.

Interoperability

Compatibility with third party apps (wallets/contracts/etc) is necessary.
Proven interoperability at scale that surpasses the required specifications. Our entire platform consists of interoperable blockchains connected to each other and traditional systems. APIs are well documented. Third party permissions are possible with a simple smart contract without the end user being aware. No need to learn any specialized proprietary language. Any code base (not subsets) is usable within a Docker container. Interoperable with any blockchain or traditional APIs. We’ve witnessed relatively complex systems built by engineers with no blockchain or cryptocurrency experience. We’ve also demonstrated the creation of smart contracts within minutes built with BASH shell and Node.js. Please see our source code and API documentation.

Scaling solutions should be extensible and allow third parties to build on top of it
Open source and extensible.
APIs should be well documented and stable

Documentation should be clear and complete
For full documentation, explore our docs, SDK’s, Github repo’s, architecture documents, original Disney documentation, and other links or resources provided in this proposal.

Third-party permissionless integrations should be possible & straightforward
Smart contracts are Docker based, can be written in any language using the full language (not subsets), and can therefore be integrated with any system, including traditional system APIs. Simple is better. Learning an uncommon or proprietary language should not be necessary.
Advanced knowledge of mathematics, cryptography, or L2 scaling should not be required. Compatibility with common utilities & toolchains is expected.
Dragonchain business nodes and smart contracts leverage Docker to allow the use of literally any language or executable code. No proprietary language is necessary. We’ve witnessed relatively complex systems built by engineers with no blockchain or cryptocurrency experience. We’ve also demonstrated the creation of smart contracts within minutes built with BASH shell and Node.js.

Bonus

Bonus Points: Show us how it works. Do you have an idea for a cool new use case for Community Points? Build it!

TIME

Community points could also be awarded to Reddit users based upon TIME, whereby the longer someone is part of a subreddit, the more community points they naturally gain, even without actively commenting or sharing new posts. A daily login could be required for these community points to be credited. This grants awards to readers too, and incentivizes readers to create an account on Reddit if they browse the website often. This concept could also be leveraged to provide some level of reputation based upon the duration and consistency of contribution to a subreddit.

Dragon Den

Dragonchain has already built a social media platform that harnesses community involvement. Dragon Den is a decentralized community built on the Dragonchain blockchain platform. Dragon Den is Dragonchain’s answer to fake news, trolling, and censorship. It incentivizes the creation and evaluation of quality content within communities. It could be described as being a shareholder of a subreddit or Reddit in its entirety. The more your subreddit is thriving, the more rewarding it will be. Den is currently in a public beta and in active development, though the real token economy is not live yet. There are different tokens for various purposes. Two tokens are Lair Ownership Rights (LOR) and Lair Ownership Tokens (LOT). LOT is a non-fungible token for ownership of a specific Lair. LOT will only be created and converted from LOR.
Energy (NRG) and Matter (MTR) work jointly. Your MTR determines how much NRG you receive in a 24-hour period. Providing quality content, or evaluating content will earn MTR.
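A toy model of the NRG/MTR relationship described above (the rate and scoring values are invented for illustration):

```python
# Hypothetical conversion rate: daily NRG allotment scales with held MTR.
NRG_PER_MTR = 0.1

def daily_energy(mtr_balance):
    """NRG a user can spend in a 24-hour period, given their MTR."""
    return mtr_balance * NRG_PER_MTR

def earn_mtr(current_mtr, quality_score):
    """Providing or evaluating quality content earns MTR."""
    return current_mtr + quality_score

mtr = earn_mtr(0, 50)     # user contributes well-rated content
print(daily_energy(mtr))  # 5.0 NRG available today
```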

Security. Users have full ownership & control of their points.
All community points awarded based upon any type of activity or gift are secured and provable on all Interchain networks (currently BTC, ETH, ETC). Users are free to spend and withdraw their points as they please, depending on the features Reddit chooses to bring into production.

Balances and transactions cannot be forged, manipulated, or blocked by Reddit or anyone else
Users can withdraw their balance to their ERC20 wallet, directly through Reddit. Reddit can cover the fees on their behalf, or the user covers this with a portion of their balance.

Users should own their points and be able to get on-chain ERC20 tokens without permission from anyone else
Through our console users can withdraw their ERC20 rewards. This can be achieved on Reddit too. Here is a walkthrough of our console; though it does not show the quick withdrawal functionality, a user can withdraw at any time. https://www.youtube.com/watch?v=aNlTMxnfVHw

Points should be recoverable to on-chain ERC20 tokens even if all third-parties involved go offline
If necessary, signed transactions from the Reddit system (e.g. Reddit + Subreddit) can be sent to the Ethereum smart contract for minting.

A public, third-party review attesting to the soundness of the design should be available
To our knowledge, at least two large corporations, including a top 3 accounting firm, have conducted positive reviews. These reviews have never been made public, as Dragonchain did not pay or contract for these studies to be released.

Bonus points
Public, third-party implementation review available or in progress
See above

Compatibility with HSMs & hardware wallets
For the purpose of this proposal, all tokenization would be on the Ethereum network using standard token contracts and as such, would be able to leverage all hardware wallet and Ethereum ecosystem services.

Other Considerations

Minting/distributing tokens is not performed by Reddit directly
This operation can be automated by a smart contract on Ethereum. Subreddits can, if desired, have a role to play.

One off point burning, as well as recurring, non-interactive point burning (for subreddit memberships) should be possible and scalable
This is possible and scalable with interaction between the Dragonchain Reddit system and the Ethereum token contract(s).

Fully open-source solutions are strongly preferred
Dragonchain is fully open source (see section on Disney release after conclusion).

Conclusion

Whether it is today or in the future, we would like to work together to bring secure flexibility to the highest standards. It is our hope to be considered by Ethereum, Reddit, and other integrative solutions so we may further discuss the possibilities of implementation. In our public demonstration, 256 million transactions were handled on chain in our operational network in 24 hours at a cost of $25K; run today, the same load would cost about $625. Dragonchain’s interoperable foundation provides the atmosphere necessary to implement a frictionless community points system. Thank you for your consideration of our proposal. We look forward to working with the community to make something great!
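For context, the quoted demonstration numbers work out to the following rates (simple arithmetic on the figures above):

```python
transactions = 256_000_000
seconds = 24 * 60 * 60

tps = transactions / seconds       # sustained throughput of the demo
cost_then = 25_000 / transactions  # USD per transaction at demo time
cost_now = 625 / transactions      # USD per transaction at today's pricing

print(f"{tps:,.0f} tx/s")          # 2,963 tx/s
print(f"${cost_then:.7f} per tx")  # $0.0000977 per tx
print(f"${cost_now:.7f} per tx")   # $0.0000024 per tx
```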

Disney Releases Blockchain Platform as Open Source

The team at Disney created the Disney Private Blockchain Platform. The system was a hybrid interoperable blockchain platform for ledgering and smart contract development geared toward solving problems with blockchain adoption and usability. All objective evaluation would consider the team’s output a success. We released a list of use cases that we explored in some capacity at Disney, and our input on blockchain standardization as part of our participation in the W3C Blockchain Community Group.
https://lists.w3.org/Archives/Public/public-blockchain/2016May/0052.html

Open Source

In 2016, Roets proposed to release the platform as open source to spread the technology outside of Disney, as others within the W3C group were interested in the solutions that had been created inside of Disney.
Following a long process, step by step, the team met requirements for release. Among the requirements, the team had to:
  • Obtain VP support and approval for the release
  • Verify ownership of the software to be released
  • Verify that no proprietary content would be released
  • Convince the organization that there was a value to the open source community
  • Convince the organization that there was a value to Disney
  • Offer the plan for ongoing maintenance of the project outside of Disney
  • Itemize competing projects
  • Verify no conflict of interest
  • Preferred license
  • Change the project name to not use the name Disney, any Disney character, or any other associated IP - proposed Dragonchain - approved
  • Obtain legal approval
  • Approval from corporate, parks, and other business units
  • Approval from multiple Disney patent groups
  • Copyright holder defined by Disney (Disney Connected and Advanced Technologies)
  • Trademark searches conducted for the selected name Dragonchain
  • Obtain IT security approval
  • Manual review of OSS components conducted
  • OWASP Dependency and Vulnerability Check Conducted
  • Obtain technical (software) approval
  • Offer management, process, and financial plans for the maintenance of the project.
  • Meet list of items to be addressed before release
  • Remove all Disney project references and scripts
  • Create a public distribution list for email communications
  • Remove Roets’ direct and internal contact information
  • Create public Slack channel and move from Disney slack channels
  • Create proper labels for issue tracking
  • Rename internal private Github repository
  • Add informative description to Github page
  • Expand README.md with more specific information
  • Add information beyond current “Blockchains are Magic”
  • Add getting started sections and info on cloning/forking the project
  • Add installation details
  • Add uninstall process
  • Add unit, functional, and integration test information
  • Detail how to contribute and get involved
  • Describe the git workflow that the project will use
  • Move to public, non-Disney git repository (Github or Bitbucket)
  • Obtain Disney Open Source Committee approval for release
On top of meeting the above criteria, as part of the process, the maintainer of the project had to receive the codebase on their own personal email and create accounts for maintenance (e.g. Github) with non-Disney accounts. Given the fact that the project spanned multiple business units, Roets was individually responsible for its ongoing maintenance. Because of this, he proposed in the open source application to create a non-profit organization to hold the IP and maintain the project. This was approved by Disney.
The Disney Open Source Committee approved the application known as OSSRELEASE-10, and the code was released on October 2, 2016. Disney decided to not issue a press release.
Original OSSRELEASE-10 document

Dragonchain Foundation

The Dragonchain Foundation was created on January 17, 2017. https://den.social/l/Dragonchain/24130078352e485d96d2125082151cf0/dragonchain-and-disney/
submitted by j0j0r0 to ethereum

Gridcoin 5.0.0.0-Mandatory "Fern" Release

https://github.com/gridcoin-community/Gridcoin-Research/releases/tag/5.0.0.0
Finally! After over ten months of development and testing, "Fern" has arrived! This is a whopper. 240 pull requests merged. Essentially a complete rewrite that was started with the scraper (the "neural net" rewrite) in "Denise" has now been completed. Practically the ENTIRE Gridcoin-specific codebase resting on top of the vanilla Bitcoin/Peercoin/Blackcoin PoS code has been rewritten. This removes the team requirement at last (see below), although there are many other important improvements besides that.
Fern was a monumental undertaking. We had to encode all of the old rules active for the v10 block protocol in new code and ensure that the new code was 100% compatible. This had to be done in such a way as to clear out all of the old spaghetti and ring-fence it with tightly controlled class implementations. We then wrote an entirely new, simplified ruleset for research rewards and reengineered contracts (which includes beacon management, polls, and voting) using properly classed code. The fundamentals of Gridcoin with this release are now on a very sound and maintainable footing, and the developers believe the codebase as updated here will serve as the fundamental basis for Gridcoin's future roadmap.
We have been testing this for MONTHS on testnet in various stages. The v10 (legacy) compatibility code has been running on testnet continuously as it was developed to ensure compatibility with existing nodes. During the last few months, we have done two private testnet forks and then the full public testnet testing for v11 code (the new protocol which is what Fern implements). The developers have also been running non-staking "sentinel" nodes on mainnet with this code to verify that the consensus rules are problem-free for the legacy compatibility code on the broader mainnet. We believe this amount of testing is going to result in a smooth rollout.
Given the amount of changes in Fern, I am presenting TWO changelogs below. One is high level, which summarizes the most significant changes in the protocol. The second changelog is the detailed one in the usual format, and gives you an inkling of the size of this release.

Highlights

Protocol

Note that the protocol changes will not become active until we cross the hard-fork transition height to v11, which has been set at 2053000. Given current average block spacing, this should happen around October 4, about one month from now.
Note that to get all of the beacons in the network on the new protocol, we are requiring ALL beacons to be validated. A two week (14 day) grace period is provided by the code, starting at the time of the transition height, for people currently holding a beacon to validate the beacon and prevent it from expiring. That means that EVERY CRUNCHER must advertise and validate their beacon AFTER the v11 transition (around Oct 4th) and BEFORE October 18th (or more precisely, 14 days from the actual date of the v11 transition). If you do not advertise and validate your beacon by this time, your beacon will expire and you will stop earning research rewards until you advertise and validate a new beacon.
This process has been made much easier by a brand new beacon "wizard" that helps manage beacon advertisements and renewals. Once a beacon has been validated and is a v11 protocol beacon, the normal 180 day expiration rules apply.
Note, however, that the 180 day expiration on research rewards has been removed with the Fern update. This means that while your beacon might expire after 180 days, your earned research rewards will be retained and can be claimed by advertising a beacon with the same CPID and going through the validation process again. In other words, you no longer lose earned research rewards if you fail to stake a block or renew your beacon within 180 days.
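The renewal rules above can be sketched as a simple validity check (dates and constants here are illustrative, not taken from the Gridcoin code):

```python
from datetime import date, timedelta

TRANSITION = date(2020, 10, 4)   # approximate v11 hard-fork date
GRACE = timedelta(days=14)       # validation window after the transition
EXPIRY = timedelta(days=180)     # normal beacon lifetime once validated

def beacon_active(validated_on, today):
    """A v11 beacon stays active for 180 days after validation."""
    if validated_on is None:
        # Legacy beacon never validated: dies when the grace period ends.
        return today < TRANSITION + GRACE
    return today < validated_on + EXPIRY

# Legacy beacon, owner never validated: expired after Oct 18.
print(beacon_active(None, date(2020, 10, 20)))              # False
# Validated promptly: good for 180 days from validation.
print(beacon_active(date(2020, 10, 10), date(2021, 3, 1)))  # True (142 days later)
```

Even when `beacon_active` goes False, earned rewards are retained and reclaimable with the same CPID, per the rules above.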
The transition height is also when the team requirement will be relaxed for the network.

GUI

Besides the beacon wizard, there are a number of improvements to the GUI, including new UI transaction types (and icons) for staking the superblock, sidestake sends, beacon advertisement, voting, poll creation, and transactions with a message. The main screen has been revamped with a better summary section, and better status icons. Several changes under the hood have improved GUI performance. And finally, the diagnostics have been revamped.

Blockchain

The wallet sync speed has been DRASTICALLY improved. A decent machine with a good network connection should be able to sync the entire mainnet blockchain in less than 4 hours. A fast machine with a really fast network connection and a good SSD can do it in about 2.5 hours. One of our goals was to reduce or eliminate the reliance on snapshots for mainnet, and I think we have accomplished that goal with the new sync speed. We have also streamlined the in-memory structures for the blockchain which shaves some memory use.
There are so many goodies here it is hard to summarize them all.
I would like to thank all of the contributors to this release, but especially thank @cyrossignol, whose incredible contributions formed the backbone of this release. I would also like to pay special thanks to @barton2526, @caraka, and @Quezacoatl1, who tirelessly helped during the testing and polishing phase on testnet with testing and repeated builds for all architectures.
The developers are proud to present this release to the community and we believe this represents the starting point for a true renaissance for Gridcoin!

Summary Changelog

Accrual

Changed

Most significantly, nodes calculate research rewards directly from the magnitudes in EACH superblock between stakes instead of using a two- or three-point average based on a CPID's current magnitude and the magnitude for the CPID when it last staked. For those long-timers in the community, this has been referred to as "Superblock Windows," and was first done in proof-of-concept form by @denravonska.
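An illustrative comparison with made-up magnitudes and a made-up reward rate shows why per-superblock accrual tracks actual work better than the old averaging approach:

```python
# Hypothetical magnitudes recorded in each superblock between two stakes,
# and an invented reward rate per magnitude-unit per superblock period.
magnitudes = [100, 40, 40, 40, 100]  # CPID magnitude in each superblock
rate = 0.25

# New (Fern): accrue against EACH superblock's actual magnitude.
exact = sum(m * rate for m in magnitudes)

# Old: two-point average of magnitude at the previous and current stake.
avg = (magnitudes[0] + magnitudes[-1]) / 2 * rate * len(magnitudes)

print(exact)  # 80.0  -- reflects the dip in magnitude between stakes
print(avg)    # 125.0 -- overpays because the dip is invisible to the average
```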

Removed

Beacons

Added

Changed

Removed

Unaltered

As a reminder:

Superblocks

Added

Changed

Removed

Voting

Added

Changed

Removed

Detailed Changelog

[5.0.0.0] 2020-09-03, mandatory, "Fern"

Added

Changed

Removed

Fixed

submitted by jamescowens to gridcoin

Zano Newcomers Introduction/FAQ - please read!

Welcome to the Zano Sticky Introduction/FAQ!

https://preview.redd.it/al1gy9t9v9q51.png?width=424&format=png&auto=webp&s=b29a60402d30576a4fd95f592b392fae202026ca
Hopefully any questions you have will be answered by the resources below, but if you have additional questions feel free to ask them in the comments. If you're quite technically-minded, the Zano whitepaper gives a thorough overview of Zano's design and its main features.
So, what is Zano? In brief, Zano is a project started by the original developers of CryptoNote. Coins with market caps totalling well over a billion dollars (Monero, Haven, Loki and countless others) run upon the codebase they created. Zano is a continuation of their efforts to create the "perfect money", and brings a wealth of enhancements to their original CryptoNote code.
Development happens at a lightning pace, as the Github activity shows, but Zano is still very much a work-in-progress. Let's cut right to it:
Here's why you should pay attention to Zano over the next 12-18 months. Quoting from a recent update:
Anton Sokolov has recently joined the Zano team. ... For the last months Anton has been working on theoretical work dedicated to log-size ring signatures. These signatures theoretically allow for a logarithmic relationship between the number of decoys and the size/performance of transactions. This means that we can set mixins at a level of up to 1000 while keeping the size and processing speed of transactions reasonable. This will take Zano’s privacy to a whole new level, and we believe this technology will turn out to be groundbreaking!
If successful, this scheme will make Zano the most private, powerful and performant CryptoNote implementation on the planet. Bar none. A quantum leap in privacy with a minimal increase in resource usage. And if there's one team capable of pulling it off, it's this one.

What else makes Zano special?

You mean aside from having "the Godfather of CryptoNote" as the project lead? ;) Actually, the calibre of the developers/researchers at Zano probably is the project's single greatest strength. Drawing on years of experience, they've made careful design choices, optimizing performance with an asynchronous core architecture, and flexibility and extensibility with a modular code structure. This means that the developers are able to build and iterate fast, refining features and adding new ones at a rate that makes bigger and better-funded teams look sluggish at best.
Zano also has some unique features that set it apart from similar projects:
Privacy Firstly, if you're familiar with CryptoNote you won't be surprised that Zano transactions are private. The perfect money is fungible, and therefore must be untraceable. Bitcoin, for the most part, does little to hide your transaction data from unscrupulous observers. With Zano, privacy is the default.
The untraceability and unlinkability of Zano transactions come from its use of ring signatures and stealth addresses. What this means is that no outside observer is able to tell if two transactions were sent to the same address, and for each transaction there is a set of possible senders that make it impossible to determine who the real sender is.
Hybrid PoW-PoS consensus mechanism Zano achieves an optimal level of security by utilizing both Proof of Work and Proof of Stake for consensus. By combining the two systems, it mitigates their individual vulnerabilities (see 51% attack and "nothing at stake" problem). For an attack on Zano to have even a remote chance of success the attacker would have to obtain not only a majority of hashing power, but also a majority of the coins involved in staking. The system and its design considerations are discussed at length in the whitepaper.
Aliases Here's a stealth address: ZxDdULdxC7NRFYhCGdxkcTZoEGQoqvbZqcDHj5a7Gad8Y8wZKAGZZmVCUf9AvSPNMK68L8r8JfAfxP4z1GcFQVCS2Jb9wVzoe. I have a hard enough time remembering my phone number. Fortunately, Zano has an alias system that lets you register an address to a human-readable name. (@orsonj if you want to anonymously buy me a coffee)
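As a toy illustration of the idea, an alias system is essentially a first-come-first-served registry mapping short names to long addresses (the class and method names below are hypothetical, not Zano's actual API):

```python
class AliasRegistry:
    """Toy model of an alias system: map @names to stealth addresses."""

    def __init__(self):
        self._aliases = {}

    def register(self, alias, address):
        # First-come-first-served: a taken alias cannot be re-registered.
        if alias in self._aliases:
            raise ValueError(f"@{alias} is already taken")
        self._aliases[alias] = address

    def resolve(self, alias):
        return self._aliases[alias]

reg = AliasRegistry()
reg.register(
    "orsonj",
    "ZxDdULdxC7NRFYhCGdxkcTZoEGQoqvbZqcDHj5a7Gad8Y8wZKAGZZmVCUf9"
    "AvSPNMK68L8r8JfAfxP4z1GcFQVCS2Jb9wVzoe",
)
print(reg.resolve("orsonj")[:4])  # ZxDd
```

In Zano the registrations live on-chain, of course, so resolution is trustless rather than handled by any single registry operator.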
Multisig
Multisignature (multisig) refers to requiring multiple keys to authorize a Zano transaction. It has a number of applications, such as dividing up responsibility for a single Zano wallet among multiple parties, or creating backups where loss of a single seed doesn't lead to loss of the wallet.
Multisig and escrow are key components of the planned Decentralized Marketplace (see below), so consideration was given to each of them from the design stages. Thus Zano's multisig, rather than being tacked on at the wallet level as an afterthought, is part of its core architecture, incorporated at the protocol level. This base-layer integration means months won't be spent in the future on complicated refactoring efforts in order to integrate multisig into a codebase that wasn't designed for it. Plus, it makes it far easier for third-party developers to include multisig (implemented correctly) in any Zano wallets and applications they create in the future.
(Double Deposit MAD) Escrow
With Zano's escrow service you can create fully customizable p2p contracts that are designed to, once signed by participants, enforce adherence to their conditions in such a way that no trusted third-party escrow agent is required.
https://preview.redd.it/jp4oghyhv9q51.png?width=1762&format=png&auto=webp&s=12a1e76f76f902ed328886283050e416db3838a5
The Particl project, aside from a couple of minor differences, uses an escrow scheme that works the same way, so I've borrowed the term they coined ("Double Deposit MAD Escrow") as I think it describes the scheme perfectly. The system requires participants to make additional deposits, which they will forfeit if there is any attempt to act in a way that breaches the terms of the contract. Full details can be found in the Escrow section of the whitepaper.
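A toy payoff model (all numbers hypothetical) makes the incentive clear: once both sides have over-collateralized, completing the trade honestly is strictly cheaper than breaching:

```python
def mad_escrow_outcome(price, buyer_deposit, seller_deposit, both_cooperate):
    """Toy payoff model of double-deposit MAD escrow.

    Both sides lock a deposit on top of the price. A breach forfeits
    the deposits: mutually assured destruction makes honest completion
    the rational path for both parties.
    """
    if both_cooperate:
        # Trade completes: deposits are refunded, price moves to seller.
        return {"buyer": -price, "seller": +price}
    # A breach costs each side its deposit (the buyer also loses the payment).
    return {"buyer": -(price + buyer_deposit),
            "seller": -seller_deposit}

print(mad_escrow_outcome(100, 50, 50, both_cooperate=True))
# {'buyer': -100, 'seller': 100}
print(mad_escrow_outcome(100, 50, 50, both_cooperate=False))
# {'buyer': -150, 'seller': -50}
```

The real contract terms in Zano are fully customizable; this only captures why neither party benefits from defecting.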
The usefulness of multisig and the escrow system may not seem obvious at first, but as mentioned before they'll form the backbone of Zano's Decentralized Marketplace service (described in the next section).

What does the future hold for Zano?

The planned upgrade to Zano's privacy, mentioned at the start, is obviously one of the most exciting things the team is working on, but it's not the only thing.
Zano Roadmap
Decentralized Marketplace
From the beginning, the Zano team's goal has been to create the perfect money. And money can't just be some vehicle for speculative investment; money must be used. To that end, the team have created a set of tools to make it as simple as possible for Zano to be integrated into eCommerce platforms. Zano's APIs and plugins are easy to use, allowing even those with very little coding experience to use them in their e-commerce-related ventures. The culmination of this effort will be a full Decentralized Anonymous Marketplace built on top of the Zano blockchain. Rather than being accessed via the wallet, it will act more as a service - Marketplace as a Service (MAAS) - for anyone who wishes to use it. The inclusion of a simple "snippet" of code into a website is all that's needed to become part of a global decentralized, trustless and private E-commerce network.
Atomic Swaps
Just as Zano's marketplace will allow you to transact without needing to trust your counterparty, atomic swaps will let you easily convert between Zano and other cryptocurrencies without having to trust a third-party service such as a centralized exchange. On top of that, it will also pave the way for Zano's inclusion in the many decentralized exchange (DEX) services that have emerged in recent years.

Where can I buy Zano?

Zano's currently listed on the following exchanges:
https://coinmarketcap.com/currencies/zano/markets/
It goes without saying, neither I nor the Zano team work for any of the exchanges or can vouch for their reliability. Use at your own risk and never leave coins on a centralized exchange for longer than necessary. Your keys, your coins!
If you have any old graphics cards lying around (both AMD & NVIDIA), then Zano is also mineable through its unique ProgPowZ algorithm. Here's a guide on how to get started.
Once you have some Zano, you can safely store it in one of the desktop or mobile wallets (available for all major platforms).

How can I support Zano?

Zano has no marketing department, which is why this post has been written by some guy and not the "Chief Growth Engineer @ Zano Enterprises". The hard part is already done: there's a team of world class developers and researchers gathered here. But, at least at the current prices, the team's funds are enough to cover the cost of development and little more. So the job of publicizing the project falls to the community. If you have any experience in community building/growth hacking at another cryptocurrency or open source project, or if you're a Zano holder who would like to ensure the project's long-term success by helping to spread the word, then send me a pm. We need to get organized.
Researchers and developers are also very welcome. Working at the cutting edge of mathematics and cryptography means Zano provides challenging and rewarding work for anyone in those fields. Please contact the project's Community Manager u/Jed_T if you're interested in joining the team.
Social Links:
Twitter
Discord Server
Telegram Group
Medium blog
I'll do my best to keep this post accurate and up to date. Message me please with any suggested improvements and leave any questions you have below.
Welcome to the Zano community and the new decentralized private economy!
submitted by OrsonJ to Zano

Scaling Reddit Community Points with Arbitrum Rollup: a piece of cake

https://preview.redd.it/b80c05tnb9e51.jpg?width=2550&format=pjpg&auto=webp&s=850282c1a3962466ed44f73886dae1c8872d0f31
Submitted for consideration to The Great Reddit Scaling Bake-Off
Baked by the pastry chefs at Offchain Labs
Please send questions or comments to [email protected]
1. Overview
We're excited to submit Arbitrum Rollup for consideration to The Great Reddit Scaling Bake-Off. Arbitrum Rollup is the only Ethereum scaling solution that supports arbitrary smart contracts without compromising on Ethereum's security or adding points of centralization. For Reddit, this means that Arbitrum can not only scale the minting and transfer of Community Points, but it can foster a creative ecosystem built around Reddit Community Points enabling points to be used in a wide variety of third party applications. That's right -- you can have your cake and eat it too!
Arbitrum Rollup isn't just Ethereum-style. Its Layer 2 transactions are byte-for-byte identical to Ethereum, which means Ethereum users can continue to use their existing addresses and wallets, and Ethereum developers can continue to use their favorite toolchains and development environments out-of-the-box with Arbitrum. Coupling Arbitrum’s tooling-compatibility with its trustless asset interoperability, Reddit not only can scale but can onboard the entire Ethereum community at no cost by giving them the same experience they already know and love (well, certainly know).
To benchmark how Arbitrum can scale Reddit Community Points, we launched the Reddit contracts on an Arbitrum Rollup chain. Since Arbitrum provides full Solidity support, we didn't have to rewrite the Reddit contracts or try to mimic their functionality using an unfamiliar paradigm. Nope, none of that. We launched the Reddit contracts unmodified on Arbitrum Rollup complete with support for minting and distributing points. Like every Arbitrum Rollup chain, the chain included a bridge interface in which users can transfer Community Points or any other asset between the L1 and L2 chains. Arbitrum Rollup chains also support dynamic contract loading, which would allow third-party developers to launch custom ecosystem apps that integrate with Community Points on the very same chain that runs the Reddit contracts.
1.1 Why Ethereum
Perhaps the most exciting benefit of distributing Community Points using a blockchain is the ability to seamlessly port points to other applications and use them in a wide variety of contexts. Applications may include simple transfers such as a restaurant that allows Redditors to spend points on drinks. Or it may include complex smart contracts -- such as placing Community Points as a wager for a multiparty game or as collateral in a financial contract.
The common denominator between all of the fun uses of Reddit points is that it needs a thriving ecosystem of both users and developers, and the Ethereum blockchain is perhaps the only smart contract platform with significant adoption today. While many Layer 1 blockchains boast lower cost or higher throughput than the Ethereum blockchain, more often than not, these attributes mask the reality of little usage, weaker security, or both.
Perhaps another platform with significant usage will rise in the future. But today, Ethereum captures the mindshare of the blockchain community, and for Community Points to provide the most utility, the Ethereum blockchain is the natural choice.
1.2 Why Arbitrum
While Ethereum's ecosystem is unmatched, the reality is that fees are high and capacity is too low to support the scale of Reddit Community Points. Enter Arbitrum. Arbitrum Rollup provides all of the ecosystem benefits of Ethereum, but with orders of magnitude more capacity and at a fraction of the cost of native Ethereum smart contracts. And most of all, we don't change the experience from users. They continue to use the same wallets, addresses, languages, and tools.
Arbitrum Rollup is not the only solution that can scale payments, but it is the only developed solution that can scale both payments and arbitrary smart contracts trustlessly, which means that third party users can build highly scalable add-on apps that can be used without withdrawing money from the Rollup chain. If you believe that Reddit users will want to use their Community Points in smart contracts--and we believe they will--then it makes the most sense to choose a single scaling solution that can support the entire ecosystem, eliminating friction for users.
We view being able to run smart contracts in the same scaling solution as fundamentally critical since if there's significant demand in running smart contracts from Reddit's ecosystem, this would be a load on Ethereum and would itself require a scaling solution. Moreover, having different scaling solutions for the minting/distribution/spending of points and for third party apps would be burdensome for users as they'd have to constantly shuffle their Points back and forth.
2. Arbitrum at a glance
Arbitrum Rollup has a unique value proposition as it offers a combination of features that no other scaling solution achieves. Here we highlight its core attributes.
Decentralized. Arbitrum Rollup is as decentralized as Ethereum. Unlike some other Layer 2 scaling projects, Arbitrum Rollup doesn't have any centralized components or centralized operators who can censor users or delay transactions. Even in non-custodial systems, centralized components provide a risk as the operators are generally incentivized to increase their profit by extracting rent from users often in ways that severely degrade user experience. Even if centralized operators are altruistic, centralized components are subject to hacking, coercion, and potential liability.
Massive Scaling. Arbitrum achieves order of magnitude scaling over Ethereum's L1 smart contracts. Our software currently supports 453 transactions-per-second for basic transactions (at 1616 Ethereum gas per tx). We have a lot of room left to optimize (e.g. aggregating signatures), and over the next several months capacity will increase significantly. As described in detail below, Arbitrum can easily support and surpass Reddit's anticipated initial load, and its capacity will continue to improve as Reddit's capacity needs grow.
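For rough scale, the quoted figures can be compared against a simple-transfer baseline on 2020-era L1 Ethereum (the block gas limit, block time, and 21,000-gas transfer cost below are our assumptions for that baseline, not numbers from this proposal):

```python
tps_l2 = 453                   # quoted Arbitrum basic-tx throughput
l1_gas_per_block = 12_500_000  # assumed 2020-era L1 block gas limit
block_time = 13                # assumed average L1 block time, seconds
gas_per_transfer = 21_000      # intrinsic gas for a simple L1 ETH transfer

tps_l1 = l1_gas_per_block / block_time / gas_per_transfer
print(round(tps_l1, 1))           # 45.8 simple transfers per second on L1
print(round(tps_l2 / tps_l1, 1))  # 9.9x -- the order-of-magnitude claim
```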
Low cost. The cost of running Arbitrum Rollup is quite low compared to L1 Ethereum and other scaling solutions such as those based on zero-knowledge proofs. Layer 2 fees are low, fixed, and predictable and should not be overly burdensome for Reddit to cover. Nobody needs to use special equipment or high-end machines. Arbitrum requires validators, which is a permissionless role that can be run on any reasonable on-line machine. Although anybody can act as a validator, in order to protect against a “tragedy of the commons” and make sure reputable validators are participating, we support a notion of “invited validators” that are compensated for their costs. In general, users pay (low) fees to cover the invited validators’ costs, but we imagine that Reddit may cover this cost for its users. See more on the costs and validator options below.
Ethereum Developer Experience. Not only does Arbitrum support EVM smart contracts, but the developer experience is identical to that of L1 Ethereum contracts and fully compatible with Ethereum tooling. Developers can port existing Solidity apps or write new ones using their favorite and familiar toolchains (e.g. Truffle, Buidler). There are no new languages or coding paradigms to learn.
Ethereum wallet compatibility. Just as in Ethereum, Arbitrum users need only hold keys, but do not have to store any coin history or additional data to protect or access their funds. Since Arbitrum transactions are semantically identical to Ethereum L1 transactions, existing Ethereum users can use their existing Ethereum keys with their existing wallet software such as Metamask.
Token interoperability. Users can easily transfer their ETH, ERC-20 and ERC-721 tokens between Ethereum and the Arbitrum Rollup chain. As we explain in detail below, it is possible to mint tokens in L2 that can subsequently be withdrawn and recognized by the L1 token contract.
Fast finality. Transactions complete with the same finality time as Ethereum L1 (and it's possible to get faster finality guarantees by trading away trust assumptions; see the Arbitrum Rollup whitepaper for details).
Non-custodial. Arbitrum Rollup is a non-custodial scaling solution, so users control their funds/points and neither Reddit nor anyone else can ever access or revoke points held by users.
Censorship Resistant. Since it's completely decentralized, and the Arbitrum protocol guarantees progress trustlessly, Arbitrum Rollup is just as censorship-proof as Ethereum.
Block explorer. The Arbitrum Rollup block explorer allows users to view and analyze transactions on the Rollup chain.
Limitations
Although this is a bake-off, we're not going to sugar coat anything. Arbitrum Rollup, like any Optimistic Rollup protocol, does have one limitation, and that's the delay on withdrawals.
As for the concrete length of the delay, we've done a good deal of internal modeling and have blogged about this as well. Our current modeling suggests a 3-hour delay is sufficient (but as discussed in the linked post there is a tradeoff space between the length of the challenge period and the size of the validators’ deposit).
Note that this doesn't mean that the chain is delayed for three hours. Arbitrum Rollup supports pipelining of execution, which means that validators can keep building new states even while previous ones are “in the pipeline” for confirmation. As the challenge delays expire for each update, a new state will be confirmed (read more about this here).
So activity and progress on the chain are not delayed by the challenge period. The only thing that's delayed is the consummation of withdrawals. Recall though that any single honest validator knows immediately (at the speed of L1 finality) which state updates are correct and can guarantee that they will eventually be confirmed, so once a valid withdrawal has been requested on-chain, every honest party knows that the withdrawal will definitely happen. There's a natural place here for a liquidity market in which a validator (or someone who trusts a validator) can provide withdrawal loans for a small interest fee. This is a no-risk business for them as they know which withdrawals will be confirmed (and can force their confirmation trustlessly no matter what anyone else does) but are just waiting for on-chain finality.
3. The recipe: How Arbitrum Rollup works
For a description of the technical components of Arbitrum Rollup and how they interact to create a highly scalable protocol with a developer experience that is identical to Ethereum, please refer to the following documents:
Arbitrum Rollup Whitepaper
Arbitrum academic paper (describes a previous version of Arbitrum)
4. Developer docs and APIs
For full details about how to set up and interact with an Arbitrum Rollup chain or validator, please refer to our developer docs, which can be found at https://developer.offchainlabs.com/.
Note that the Arbitrum version described on that site is older and will soon be replaced by the version we are entering in Reddit Bake-Off, which is still undergoing internal testing before public release.
5. Who are the validators?
As with any Layer 2 protocol, advancing the protocol correctly requires at least one validator (sometimes called a block producer) that is honest and available. A natural question is: who are the validators?
Recall that the validator set for an Arbitrum chain is open and permissionless; anyone can start or stop validating at will. (A useful analogy is to full nodes on an L1 chain.) But we understand that even though anyone can participate, Reddit may want to guarantee that highly reputable nodes are validating their chain. Reddit may choose to validate the chain themselves and/or hire third-party validators. To this end, we have begun building a marketplace for validator-for-hire services so that dapp developers can outsource validation services to reputable nodes with high up-time. We've announced a partnership in which Chainlink nodes will provide Arbitrum validation services, and we expect to announce more partnerships shortly with other blockchain infrastructure providers.
Although there is no requirement that validators are paid, Arbitrum’s economic model tracks validators’ costs (e.g. amount of computation and storage) and can charge small fees on user transactions, using a gas-type system, to cover those costs. Alternatively, a single party such as Reddit can agree to cover the costs of invited validators.
6. Reddit Contract Support
Since Arbitrum contracts and transactions are byte-for-byte compatible with Ethereum, supporting the Reddit contracts is as simple as launching them on an Arbitrum chain.
Minting. Arbitrum Rollup supports hybrid L1/L2 tokens which can be minted in L2 and then withdrawn onto the L1. An L1 contract at address A can make a special call to the EthBridge which deploys a "buddy contract" to the same address A on an Arbitrum chain. Since it's deployed at the same address, users can know that the L2 contract is the authorized "buddy" of the L1 contract on the Arbitrum chain.
For minting, the L1 contract is a standard ERC-20 contract which mints and burns tokens when requested by the L2 contract. It is paired with an ERC-20 contract in L2 which mints tokens based on whatever programmer provided minting facility is desired and burns tokens when they are withdrawn from the rollup chain. Given this base infrastructure, Arbitrum can support any smart contract based method for minting tokens in L2, and indeed we directly support Reddit's signature/claim based minting in L2.
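The mint-in-L2, withdraw-to-L1 flow can be sketched as a toy Python model of the balance accounting. This is only an illustration: the real system is a pair of Solidity contracts wired together through the EthBridge, and all names here (L1Token, L2Token, mint_on_withdraw) are invented for the sketch.

```python
class L1Token:
    """Toy stand-in for the L1 ERC-20 side of the hybrid token pair.
    It mints tokens only when they are withdrawn from the L2 chain."""
    def __init__(self):
        self.balances = {}

    def mint_on_withdraw(self, user, amount):
        # Called when tokens burned in L2 are withdrawn to L1.
        self.balances[user] = self.balances.get(user, 0) + amount


class L2Token:
    """Toy stand-in for the paired L2 ERC-20 'buddy' contract."""
    def __init__(self, l1_buddy):
        self.l1 = l1_buddy
        self.balances = {}

    def mint(self, user, amount):
        # Whatever programmer-provided minting facility is desired
        # (e.g. a signature/claim scheme) would gate this call.
        self.balances[user] = self.balances.get(user, 0) + amount

    def withdraw(self, user, amount):
        # Burn in L2; the L1 buddy then mints the same amount.
        assert self.balances.get(user, 0) >= amount, "insufficient balance"
        self.balances[user] -= amount
        self.l1.mint_on_withdraw(user, amount)


l1 = L1Token()
l2 = L2Token(l1)
l2.mint("alice", 100)      # alice claims 100 points in L2
l2.withdraw("alice", 40)   # 40 of them move out to L1
```

After the withdrawal, alice holds 60 tokens in L2 and 40 in the L1 contract; the total supply across the pair is conserved.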
Batch minting. What's better than a mint cookie? A whole batch! In addition to supporting Reddit’s current minting/claiming scheme, we built a second minting design, which we believe outperforms the signature/claim system in many scenarios.
In the current system, Reddit periodically issues signed statements to users, who then take those statements to the blockchain to claim their tokens. An alternative approach would have Reddit directly submit the list of users/amounts to the blockchain and distribute the tokens to the users without the signature/claim process.
To optimize the cost efficiency of this approach, we designed an application-specific compression scheme to minimize the size of the batch distribution list. We analyzed the data from Reddit's previous distributions and found that the data is highly compressible since token amounts are small and repeated, and addresses appear multiple times. Our function groups transactions by size, and replaces previously-seen addresses with a shorter index value. We wrote client code to compress the data, wrote a Solidity decompressing function, and integrated that function into Reddit’s contract running on Arbitrum.
When we ran the compression function on the previous Reddit distribution data, we found that we could compress batched minting data down to 11.8 bytes per minting event (averaged over a 6-month trace of Reddit’s historical token grants), compared with roughly 174 bytes of on-chain data needed for the signature/claim approach to minting (roughly 43 for an RLP-encoded null transaction + 65 for Reddit's signature + 65 for the user's signature + roughly 8 for the number of Points).
The relative benefit of the two approaches with respect to on-chain call data cost depends on the percentage of users that will actually claim their tokens on chain. With the above figures, batch minting will be cheaper if roughly 5% of users redeem their claims. We stress that our compression scheme is not Arbitrum-specific and would be beneficial in any general-purpose smart contract platform.
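As a rough illustration of the compression idea, here is a toy Python codec that replaces previously seen addresses with a short index. The field widths (20-byte addresses, 2-byte indices and amounts) are assumptions for the sketch, not Arbitrum's actual encoding, and the real implementation also groups transactions by amount size, which this sketch omits.

```python
def compress_batch(grants):
    """Toy compressor for a batch-mint list of (address, amount) pairs.

    First-seen addresses are emitted in full (20 bytes); repeats are
    replaced by a 2-byte back-reference into the table of addresses
    seen so far. Amounts are assumed to fit in 2 bytes since Reddit's
    grant amounts are small and repetitive.
    """
    seen = {}   # address -> index of first appearance
    out = []    # stream of encoded fields (tag, value)
    size = 0    # encoded size in bytes
    for addr, amount in grants:
        if addr in seen:
            out.append(("idx", seen[addr]))
            size += 2            # short back-reference
        else:
            seen[addr] = len(seen)
            out.append(("addr", addr))
            size += 20           # full address
        out.append(("amt", amount))
        size += 2
    return out, size

# A repeat-heavy batch compresses well: 2 fresh addresses, 2 repeats.
grants = [("0xA", 10), ("0xB", 10), ("0xA", 5), ("0xB", 5)]
encoded, nbytes = compress_batch(grants)
```

Here the naive encoding would take 4 × (20 + 2) = 88 bytes, while the indexed encoding takes 2 × 22 + 2 × 4 = 52 bytes, and the savings grow as addresses repeat more often across a long distribution trace.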
8. Benchmarks and costs
In this section, we give the full costs of operating the Reddit contracts on an Arbitrum Rollup chain, including the L1 gas costs for the Rollup chain, the costs of computation and storage for the L2 validators, as well as the capital lockup requirements for staking.
Arbitrum Rollup is still on testnet, so we did not run mainnet benchmarks. Instead, we measured the L1 gas cost and L2 workload for Reddit operations on Arbitrum and calculated the total cost assuming current Ethereum gas prices. As noted below in detail, our measurements do not assume that Arbitrum is consuming the entire capacity of Ethereum. We will present the details of our model now, but for full transparency you can also play around with it yourself and adjust the parameters by copying the spreadsheet found here.
Our cost model is based on measurements of Reddit’s contracts, running unmodified (except for the addition of a batch minting function) on Arbitrum Rollup on top of Ethereum.
On the distribution of transactions and frequency of assertions. Reddit's instructions specify the following minimum parameters that submissions should support:
Over a 5 day period, your scaling PoC should be able to handle:
  • 100,000 point claims (minting & distributing points)
  • 25,000 subscriptions
  • 75,000 one-off points burning
  • 100,000 transfers
We provide the full costs of operating an Arbitrum Rollup chain with this usage under the assumption that tokens are minted or granted to users in batches, but other transactions are uniformly distributed over the 5 day period. Unlike some other submissions, we do not make unrealistic assumptions that all operations can be submitted in enormous batches. We assume that batch minting is done in batches that use only a few percent of an L1 block’s gas, and that other operations come in evenly over time and are submitted in batches, with one batch every five minutes to keep latency reasonable. (Users are probably already waiting for L1 finality, which takes at least that long to achieve.)
We note that assuming only 300,000 transactions arrive, uniformly over the 5 day period, makes our benchmark numbers lower, but we believe this reflects the true cost of running the system. To see why, suppose batches are submitted every five minutes (20 L1 blocks) and there's a fixed overhead of c bytes of calldata per batch, the cost of which gets amortized over all transactions executed in that batch. Assume that each individual transaction adds a marginal cost of t. Lastly, assume the capacity of the scaling system is high enough that it could support all of Reddit's 300,000 transactions within a single 20-block batch (i.e. that there is more than c + 300,000*t bytes of calldata available in 20 blocks).
Consider what happens if c, the per-batch overhead, is large (which it is in some systems, but not in Arbitrum). In the scenario that transactions actually arrive at the system's capacity and each batch is full, then c gets amortized over 300,000 transactions. But if we assume that the system is not running at capacity, and only receives 300,000 transactions arriving uniformly over 5 days, then each 20-block assertion will contain about 200 transactions, and thus each transaction will pay a nontrivial cost due to c.
We are aware that other proposals presented scaling numbers assuming that 300,000 transactions arrived at maximum capacity and were executed in a single mega-transaction, but according to our estimates, for at least one such report, this led to a reported gas price that was 2-3 orders of magnitude lower than it would have been assuming uniform arrival. We make more realistic batching assumptions, and we believe Arbitrum compares well when batch sizes are realistic.
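The amortization argument can be made concrete with a few lines of Python. The values of c and t below are invented for illustration, not measurements of any particular system; the point is only how batch size moves the per-transaction cost when c is large.

```python
def per_tx_cost(c, t, batch_size):
    """Per-transaction calldata cost in bytes: the fixed per-batch
    overhead c amortized over the batch, plus the marginal cost t."""
    return c / batch_size + t

# Illustrative (assumed) numbers: a large per-batch overhead of
# c = 50,000 bytes, and t = 12 bytes of calldata per transaction.
c, t = 50_000, 12

# One mega-batch of all 300,000 transactions: overhead nearly vanishes.
full = per_tx_cost(c, t, 300_000)

# Uniform arrival: 300,000 txs over 5 days, one batch every 5 minutes
# -> 5 * 24 * 12 = 1440 batches of about 208 transactions each.
uniform = per_tx_cost(c, t, 300_000 // 1440)
```

With these made-up numbers the mega-batch scenario reports roughly 12 bytes per transaction while uniform arrival pays over 250, a gap of more than an order of magnitude purely from the batching assumption.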
Our model. Our cost model includes several sources of cost:
  • L1 gas costs: This is the cost of posting transactions as calldata on the L1 chain, as well as the overhead associated with each batch of transactions, and the L1 cost of settling transactions in the Arbitrum protocol.
  • Validator’s staking costs: In normal operation, one validator will need to be staked. The stake is assumed to be 0.2% of the total value of the chain (which is assumed to be $1 per user who is eligible to claim points). The cost of staking is the interest that could be earned on the money if it were not staked.
  • Validator computation and storage: Every validator must do computation to track the chain’s processing of transactions, and must maintain storage to keep track of the contracts’ EVM storage. The cost of computation and storage are estimated based on measurements, with the dollar cost of resources based on Amazon Web Services pricing.
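A toy version of these three cost buckets, using placeholder constants rather than the actual values from our spreadsheet, might look like:

```python
def chain_cost(users, txs, gas_price_gwei, eth_usd, interest_rate):
    """Toy model of the three cost buckets above (in USD). Every
    constant here is an illustrative placeholder, not a value from
    the published model."""
    # L1 gas: calldata at 16 gas per non-zero byte, with an assumed
    # 12 bytes of compressed calldata per transaction.
    CALLDATA_GAS_PER_BYTE = 16
    BYTES_PER_TX = 12
    gas = txs * BYTES_PER_TX * CALLDATA_GAS_PER_BYTE
    l1_cost = gas * gas_price_gwei * 1e-9 * eth_usd

    # Staking: stake is 0.2% of chain value ($1 per eligible user);
    # the cost is the interest forgone on the staked capital.
    stake = 0.002 * users * 1.0
    staking_cost = stake * interest_rate

    # Validator computation and storage: a flat assumed server bill.
    compute_storage = 50.0
    return l1_cost + staking_cost + compute_storage

# 100,000 users, 300,000 txs, 50 gwei gas, ETH at $400, 5% interest.
total = chain_cost(100_000, 300_000, 50, 400, 0.05)
```

Even with these invented inputs, the L1 calldata term dwarfs the staking and server terms, which is consistent with calldata being the predominant cost in any rollup-based system.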
It’s clear from our modeling that the predominant cost is for L1 calldata. This will probably be true for any plausible rollup-based system.
Our model also shows that Arbitrum can scale to workloads much larger than Reddit’s nominal workload, without exhausting L1 or L2 resources. The scaling bottleneck will ultimately be calldata on the L1 chain. We believe that cost could be reduced substantially if necessary by clever encoding of data. (In our design any compression / decompression of L2 transaction calldata would be done by client software and L2 programs, never by an L1 contract.)
9. Status of Arbitrum Rollup
Arbitrum Rollup is live on Ethereum testnet. All of the code written to date including everything included in the Reddit demo is open source and permissively licensed under the Apache V2 license. The first testnet version of Arbitrum Rollup was released on testnet in February. Our current internal version, which we used to benchmark the Reddit contracts, will be released soon and will be a major upgrade.
Both the Arbitrum design and its implementation are audited by independent third parties. The Arbitrum academic paper was published at USENIX Security, a top-tier peer-reviewed academic venue. For the Arbitrum software, we have engaged Trail of Bits for a security audit, which is currently ongoing, and we are committed to having a clean report before launching on Ethereum mainnet.
10. Reddit Universe Arbitrum Rollup Chain
The benchmarks described in this document were all measured using the latest internal build of our software. When we release the new software upgrade publicly we will launch a Reddit Universe Arbitrum Rollup chain as a public demo, which will contain the Reddit contracts as well as a Uniswap instance and a Connext Hub, demonstrating how Community Points can be integrated into third party apps. We will also allow members of the public to dynamically launch ecosystem contracts. We at Offchain Labs will cover the validating costs for the Reddit Universe public demo.
If the folks at Reddit would like to evaluate our software prior to our public demo, please email us at [email protected] and we'd be more than happy to provide early access.
11. Even more scaling: Arbitrum Sidechains
Rollups are an excellent approach to scaling, and we are excited about Arbitrum Rollup which far surpasses Reddit's scaling needs. But looking forward to Reddit's eventual goal of supporting hundreds of millions of users, there will likely come a time when Reddit needs more scaling than any Rollup protocol can provide.
While Rollups greatly reduce costs, they don't break the linear barrier. That is, all transactions have an on-chain footprint (because all calldata must be posted on-chain), albeit a far smaller one than on native Ethereum, and the L1 limitations end up being the bottleneck for capacity and cost. Since Ethereum has limited capacity, this linear use of on-chain resources means that costs will eventually increase superlinearly with traffic.
The good news is that we at Offchain Labs have a solution in our roadmap that can satisfy this extreme-scaling setting as well: Arbitrum AnyTrust Sidechains. Arbitrum Sidechains are similar to Arbitrum Rollup, but deviate in that they name a permissioned set of validators. When a chain’s validators agree off-chain, they can greatly reduce the on-chain footprint of the protocol and require almost no data to be put on-chain. When validators can't reach unanimous agreement off-chain, the protocol reverts to Arbitrum Rollup. Technically, Arbitrum Sidechains can be viewed as a hybrid between state channels and Rollup, switching back and forth as necessary, and combining the performance and cost that state channels can achieve in the optimistic case, with the robustness of Rollup in other cases. The core technical challenge is how to switch seamlessly between modes and how to guarantee that security is maintained throughout.
Arbitrum Sidechains break through this linear barrier, while still maintaining a high level of security and decentralization. Arbitrum Sidechains provide the AnyTrust guarantee, which says that as long as any one validator is honest and available (even if you don't know which one will be), the L2 chain is guaranteed to execute correctly according to its code and guaranteed to make progress. Unlike in a state channel, offchain progress does not require unanimous consent, and liveness is preserved as long as there is a single honest validator.
Note that the trust model for Arbitrum Sidechains is much stronger than for typical BFT-style chains, which introduce a consensus "voting" protocol among a small permissioned group of validators. BFT-based protocols require a supermajority (more than 2/3) of validators to agree. In Arbitrum Sidechains, by contrast, all you need is a single honest validator to achieve guaranteed correctness and progress. Notice that in Arbitrum adding validators strictly increases security since the AnyTrust guarantee provides correctness as long as any one validator is honest and available. By contrast, in BFT-style protocols, adding nodes can be dangerous as a coalition of dishonest nodes can break the protocol.
Like Arbitrum Rollup, the developer and user experiences for Arbitrum Sidechains will be identical to that of Ethereum. Reddit would be able to choose a large and diverse set of validators, and all that they would need to guarantee to break through the scaling barrier is that a single one of them will remain honest.
We hope to have Arbitrum Sidechains in production in early 2021, and thus when Reddit reaches the scale that surpasses the capacity of Rollups, Arbitrum Sidechains will be waiting and ready to help.
While the idea to switch between channels and Rollup to get the best of both worlds is conceptually simple, getting the details right and making sure that the switch does not introduce any attack vectors is highly non-trivial and has been the subject of years of our research (indeed, we were working on this design for years before the term Rollup was even coined).
12. How Arbitrum compares
We include a comparison to several other categories, as well as specific projects when appropriate, and explain why we believe that Arbitrum is best suited for Reddit's purposes. We focus our attention on other Ethereum projects.
Payment only Rollups. Compared to Arbitrum Rollup, ZK-Rollups and other Rollups that only support token transfers have several disadvantages:
  • As outlined throughout the proposal, we believe that the entire draw of Ethereum is in its rich smart contracts support which is simply not achievable with today's zero-knowledge proof technology. Indeed, scaling with a ZK-Rollup will add friction to the deployment of smart contracts that interact with Community Points as users will have to withdraw their coins from the ZK-Rollup and transfer them to a smart contract system (like Arbitrum). The community will be best served if Reddit builds on a platform that has built-in, frictionless smart-contract support.
  • All other Rollup protocols of which we are aware employ a centralized operator. While it's true that users retain custody of their coins, the centralized operator can often profit from censoring, reordering, or delaying transactions. A common misconception is that since they're non-custodial protocols, a centralized sequencer does not pose a risk but this is incorrect as the sequencer can wreak havoc or shake down users for side payments without directly stealing funds.
  • Sidechain type protocols can eliminate some of these issues, but they are not trustless. Instead, they require trust in some quorum of a committee, often requiring two-thirds of the committee to be honest, compared to rollup protocols like Arbitrum that require only a single honest party. In addition, not all sidechain type protocols have committees that are diverse, or even non-centralized, in practice.
  • Plasma-style protocols have a centralized operator and do not support general smart contracts.
13. Concluding Remarks
While it's ultimately up to the judges’ palate, we believe that Arbitrum Rollup is the bakeoff choice that Reddit kneads. We far surpass Reddit's specified workload requirement at present, have much room to optimize Arbitrum Rollup in the near term, and have a clear path to get Reddit to hundreds of millions of users. Furthermore, we are the only project that gives developers and users the identical interface as the Ethereum blockchain and is fully interoperable and tooling-compatible, and we do this all without any new trust assumptions or centralized components.
But no matter how the cookie crumbles, we're glad to have participated in this bake-off and we thank you for your consideration.
About Offchain Labs
Offchain Labs, Inc. is a venture-funded New York company that spun out of Princeton University research, and is building the Arbitrum platform to usher in the next generation of scalable, interoperable, and compatible smart contracts. Offchain Labs is backed by Pantera Capital, Compound VC, Coinbase Ventures, and others.
Leadership Team
Ed Felten
Ed Felten is Co-founder and Chief Scientist at Offchain Labs. He is on leave from Princeton University, where he is the Robert E. Kahn Professor of Computer Science and Public Affairs. From 2015 to 2017 he served at the White House as Deputy United States Chief Technology Officer and senior advisor to the President. He is an ACM Fellow and member of the National Academy of Engineering. Outside of work, he is an avid runner, cook, and L.A. Dodgers fan.
Steven Goldfeder
Steven Goldfeder is Co-founder and Chief Executive Officer at Offchain Labs. He holds a PhD from Princeton University, where he worked at the intersection of cryptography and cryptocurrencies including threshold cryptography, zero-knowledge proof systems, and post-quantum signatures. He is a co-author of Bitcoin and Cryptocurrency Technologies, the leading textbook on cryptocurrencies, and he has previously worked at Google and Microsoft Research, where he co-invented the Picnic signature algorithm. When not working, you can find Steven spending time with his family, taking a nature walk, or twisting balloons.
Harry Kalodner
Harry Kalodner is Co-founder and Chief Technology Officer at Offchain Labs where he leads the engineering team. Before the company he attended Princeton as a Ph.D candidate where his research explored economics, anonymity, and incentive compatibility of cryptocurrencies, and he also has worked at Apple. When not up at 3:00am writing code, Harry occasionally sleeps.
submitted by hkalodner to ethereum

All you need to know about Yield Farming - The rocket fuel for Defi

It’s effectively July 2017 in the world of decentralized finance (DeFi), and as in the heady days of the initial coin offering (ICO) boom, the numbers are only trending up.
According to DeFi Pulse, there is $1.9 billion in crypto assets locked in DeFi right now. According to the CoinDesk ICO Tracker, the ICO market started chugging past $1 billion in July 2017, just a few months before token sales started getting talked about on TV.
Debate juxtaposing these numbers if you like, but what no one can question is this: Crypto users are putting more and more value to work in DeFi applications, driven largely by the introduction of a whole new yield-generating pasture, Compound’s COMP governance token.
Governance tokens enable users to vote on the future of decentralized protocols, sure, but they also present fresh ways for DeFi founders to entice assets onto their platforms.
That said, it’s the crypto liquidity providers who are the stars of the present moment. They even have a meme-worthy name: yield farmers.


Where it started

Ethereum-based credit market Compound started distributing its governance token, COMP, to the protocol’s users this past June 15. Demand for the token (heightened by the way its automatic distribution was structured) kicked off the present craze and moved Compound into the leading position in DeFi.
The hot new term in crypto is “yield farming,” a shorthand for clever strategies where putting crypto temporarily at the disposal of some startup’s application earns its owner more cryptocurrency.
Another term floating about is “liquidity mining.”
The buzz around these concepts has evolved into a low rumble as more and more people get interested.
The casual crypto observer who only pops into the market when activity heats up might be starting to get faint vibes that something is happening right now. Take our word for it: Yield farming is the source of those vibes.
But if all these terms (“DeFi,” “liquidity mining,” “yield farming”) are so much Greek to you, fear not. We’re here to catch you up. We’ll get into all of them.
We’re going to go from very basic to more advanced, so feel free to skip ahead.

What are tokens?

Most CoinDesk readers probably know this, but just in case: Tokens are like the money video-game players earn while fighting monsters, money they can use to buy gear or weapons in the universe of their favorite game.
But with blockchains, tokens aren’t limited to only one massively multiplayer online money game. They can be earned in one and used in lots of others. They usually represent either ownership in something (like a piece of a Uniswap liquidity pool, which we will get into later) or access to some service. For example, in the Brave browser, ads can only be bought using basic attention token (BAT).
Tokens proved to be the big use case for Ethereum, the second-biggest blockchain in the world. The term of art here is “ERC-20 tokens,” which refers to a software standard that allows token creators to write rules for them. Tokens can be used a few ways. Often, they are used as a form of money within a set of applications. So the idea for Kin was to create a token that web users could spend with each other at such tiny amounts that it would almost feel like they weren’t spending anything; that is, money for the internet.
Governance tokens are different. They are not like a token at a video-game arcade, as so many tokens were described in the past. They work more like certificates to serve in an ever-changing legislature in that they give holders the right to vote on changes to a protocol.
So on the platform that proved DeFi could fly, MakerDAO, holders of its governance token, MKR, vote almost every week on small changes to parameters that govern how much it costs to borrow and how much savers earn, and so on.
Read more: Why DeFi’s Billion-Dollar Milestone Matters
One thing all crypto tokens have in common, though, is they are tradable and they have a price. So, if tokens are worth money, then you can bank with them or at least do things that look very much like banking. Thus: decentralized finance.

What is DeFi?

Fair question. For folks who tuned out for a bit in 2018, we used to call this “open finance.” That construction seems to have faded, though, and “DeFi” is the new lingo.
In case that doesn’t jog your memory, DeFi is all the things that let you play with money, and the only identification you need is a crypto wallet.
I can explain this but nothing really brings it home like trying one of these applications. If you have an Ethereum wallet that has even $20 worth of crypto in it, go do something on one of these products. Pop over to Uniswap and buy yourself some FUN (a token for gambling apps) or WBTC (wrapped bitcoin). Go to MakerDAO and create $5 worth of DAI (a stablecoin that tends to be worth $1) out of the digital ether. Go to Compound and borrow $10 in USDC.
(Notice the very small amounts I’m suggesting. The old crypto saying “don’t put in more than you can afford to lose” goes double for DeFi. This stuff is uber-complex and a lot can go wrong. These may be “savings” products but they’re not for your retirement savings.)
Immature and experimental though it may be, the technology’s implications are staggering. On the normal web, you can’t buy a blender without giving the site owner enough data to learn your whole life history. In DeFi, you can borrow money without anyone even asking for your name.
DeFi applications don’t worry about trusting you because they have the collateral you put up to back your debt (on Compound, for instance, a $10 debt will require around $20 in collateral).
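The over-collateralization rule is simple arithmetic. Here is a sketch assuming the 50% collateral factor implied by the example above ($20 of collateral backing a $10 debt); real collateral factors vary per asset and per protocol.

```python
def max_borrow(collateral_usd, collateral_factor=0.5):
    """Maximum debt a position can carry against posted collateral.
    The 0.5 factor is taken from the illustrative example above,
    not from any specific protocol's live parameters."""
    return collateral_usd * collateral_factor

def collateral_needed(debt_usd, collateral_factor=0.5):
    """Collateral required to back a given debt."""
    return debt_usd / collateral_factor
```

So `collateral_needed(10)` returns 20: to borrow $10 you must first lock roughly twice that value, which is what lets the protocol skip asking who you are.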
Read more: There Are More DAI on Compound Now Than There Are DAI in the World
If you do take this advice and try something, note that you can swap all these things back as soon as you’ve taken them out. Open the loan and close it 10 minutes later. It’s fine. Fair warning: It might cost you a tiny bit in fees, and the cost of using Ethereum itself right now is much higher than usual, in part due to this fresh new activity. But it’s nothing that should ruin a crypto user.
So what’s the point of borrowing for people who already have the money? Most people do it for some kind of trade. The most obvious example, to short a token (the act of profiting if its price falls). It’s also good for someone who wants to hold onto a token but still play the market.

Doesn’t running a bank take a lot of money up front?

It does, and in DeFi that money is largely provided by strangers on the internet. That’s why the startups behind these decentralized banking applications come up with clever ways to attract HODLers with idle assets.
Liquidity is the chief concern of all these different products. That is: How much money do they have locked in their smart contracts?
“In some types of products, the product experience gets much better if you have liquidity. Instead of borrowing from VCs or debt investors, you borrow from your users,” said Electric Capital managing partner Avichal Garg.
Let’s take Uniswap as an example. Uniswap is an “automated market maker,” or AMM (another DeFi term of art). This means Uniswap is a robot on the internet that is always willing to buy, and always willing to sell, any cryptocurrency for which it has a market.
On Uniswap, there is at least one market pair for almost any token on Ethereum. Behind the scenes, this means Uniswap can make it look like it is making a direct trade for any two tokens, which makes it easy for users, but it’s all built around pools of two tokens. And all these market pairs work better with bigger pools.

Why do I keep hearing about ‘pools’?

To illustrate why more money helps, let’s break down how Uniswap works.
Let’s say there was a market for USDC and DAI. These are two tokens (both stablecoins but with different mechanisms for retaining their value) that are meant to be worth $1 each all the time, and that generally tends to be true for both.
The price Uniswap shows for each token in any pooled market pair is based on the balance of each in the pool. So, simplifying this a lot for illustration’s sake, if someone were to set up a USDC/DAI pool, they should deposit equal amounts of both. In a pool with only 2 USDC and 2 DAI it would offer a price of 1 USDC for 1 DAI. But then imagine that someone put in 1 DAI and took out 1 USDC. Then the pool would have 1 USDC and 3 DAI. The pool would be very out of whack. A savvy investor could make an easy $0.50 profit by putting in 1 USDC and receiving 1.5 DAI. That’s a 50% arbitrage profit, and that’s the problem with limited liquidity.
(Incidentally, this is why Uniswap’s prices tend to be accurate, because traders watch it for small discrepancies from the wider market and trade them away for arbitrage profits very quickly.)
Read more: Uniswap V2 Launches With More Token-Swap Pairs, Oracle Service, Flash Loans
However, if there were 500,000 USDC and 500,000 DAI in the pool, a trade of 1 DAI for 1 USDC would have a negligible impact on the relative price. That’s why liquidity is helpful.
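The same constant-product arithmetic shows why depth matters: the identical one-token trade barely moves the price in a deep pool. A quick sketch (again ignoring fees):

```python
def swap_out(reserve_in: float, reserve_out: float, amount_in: float) -> float:
    """Constant-product swap (x * y = k), no fees."""
    k = reserve_in * reserve_out
    return reserve_out - k / (reserve_in + amount_in)

# The same 1-token trade against a shallow pool and a deep one.
shallow = swap_out(2, 2, 1)            # 2 DAI / 2 USDC pool
deep = swap_out(500_000, 500_000, 1)   # 500,000 DAI / 500,000 USDC pool

print(shallow)  # ~0.667 USDC -- heavy price impact
print(deep)     # ~0.999998 USDC -- negligible impact
```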
Similar effects hold across DeFi, so markets want more liquidity. Uniswap solves this by charging a tiny fee on every trade. It does this by shaving off a little bit from each trade and leaving that in the pool (so one DAI would actually trade for 0.997 USDC, after the fee, growing the overall pool by 0.003 USDC). This benefits liquidity providers because when someone puts liquidity in the pool they own a share of the pool. If there has been lots of trading in that pool, it has earned a lot of fees, and the value of each share will grow.
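That fee mechanism can be added to the same sketch. Uniswap keeps the whole input amount in the pool but prices the trade on the input minus the fee, so the pool's product k creeps upward on every swap, which is what makes each liquidity-provider share slowly worth more:

```python
FEE = 0.003  # Uniswap V2's 0.3% per-trade fee

def swap_with_fee(reserve_in: float, reserve_out: float,
                  amount_in: float, fee: float = FEE):
    """Constant-product swap where the fee stays in the pool.

    Returns (amount out, new pool product k). The trade is priced on
    amount_in * (1 - fee), but the pool keeps the full amount_in,
    so k grows a little on every trade.
    """
    amount_in_after_fee = amount_in * (1 - fee)
    k = reserve_in * reserve_out
    out = reserve_out - k / (reserve_in + amount_in_after_fee)
    new_k = (reserve_in + amount_in) * (reserve_out - out)
    return out, new_k

out, new_k = swap_with_fee(1_000_000, 1_000_000, 1)
print(out)                      # ~0.997 -- matching the figure in the text
print(new_k > 1_000_000 ** 2)   # True: the pool grew, benefiting LP shares
```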
And this brings us back to tokens.
Liquidity added to Uniswap is represented by a token, not an account. So there’s no ledger saying, “Bob owns 0.000000678% of the DAI/USDC pool.” Bob just has a token in his wallet. And Bob doesn’t have to keep that token. He could sell it. Or use it in another product. We’ll circle back to this, but it helps to explain why people like to talk about DeFi products as “money Legos.”

So how much money do people make by putting money into these products?

It can be a lot more lucrative than putting money in a traditional bank, and that’s before startups started handing out governance tokens.
Compound is the current darling of this space, so let’s use it as an illustration. As of this writing, a person can put USDC into Compound and earn 2.72% on it. They can put tether (USDT) into it and earn 2.11%. Most U.S. bank accounts earn less than 0.1% these days, which is close enough to nothing.
However, there are some caveats. First, there’s a reason the interest rates are so much juicier: DeFi is a far riskier place to park your money. There’s no Federal Deposit Insurance Corporation (FDIC) protecting these funds. If there were a run on Compound, users could find themselves unable to withdraw their funds when they wanted.
Plus, the interest is quite variable. You don’t know what you’ll earn over the course of a year. USDC’s rate is high right now. It was low last week. Usually, it hovers somewhere in the 1% range.
Similarly, a user might get tempted by assets with more lucrative yields like USDT, which typically has a much higher interest rate than USDC. (Monday morning, the reverse was true, for unclear reasons; this is crypto, remember.) The trade-off here is USDT’s transparency about the real-world dollars it’s supposed to hold in a real-world bank is not nearly up to par with USDC’s. A difference in interest rates is often the market’s way of telling you the one instrument is viewed as dicier than another.
Users making big bets on these products turn to companies like Opyn and Nexus Mutual to insure their positions, because there are no government protections in this nascent space – more on the ample risks later on.
So users can stick their assets in Compound or Uniswap and earn a little yield. But that’s not very creative. Users who look for angles to maximize that yield: those are the yield farmers.

OK, I already knew all of that. What is yield farming?

Broadly, yield farming is any effort to put crypto assets to work and generate the most returns possible on those assets.
At the simplest level, a yield farmer might move assets around within Compound, constantly chasing whichever pool is offering the best APY from week to week. This might mean moving into riskier pools from time to time, but a yield farmer can handle risk.
“Farming opens up new price arbs [arbitrage] that can spill over to other protocols whose tokens are in the pool,” said Maya Zehavi, a blockchain consultant.
Because these positions are tokenized, though, they can go further.
In a simple example, a yield farmer might put 100,000 USDT into Compound. They will get a token back for that stake, called cUSDT. Let’s say they get 100,000 cUSDT back (the formula on Compound is crazy so it’s not 1:1 like that but it doesn’t matter for our purposes here).
They can then take that cUSDT and put it into a liquidity pool that takes cUSDT on Balancer, an AMM that allows users to set up self-rebalancing crypto index funds. In normal times, this could earn a small amount more in transaction fees. This is the basic idea of yield farming. The user looks for edge cases in the system to eke out as much yield as they can across as many products as it will work on.
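To a first approximation, stacked yields on the same capital just add up. A sketch, with purely hypothetical rates (2% from lending on Compound, 1% from pooling the cUSDT on Balancer):

```python
def stacked_apy(layers: list[float]) -> float:
    """Approximate combined APY from stacking yields across protocols.

    `layers` are independent APYs earned simultaneously on the same
    capital (e.g. lending interest plus pool trading fees). This is a
    simple-interest approximation: the rates just sum.
    """
    return sum(layers)

# Hypothetical: 2% lending USDT on Compound + 1% in Balancer pool fees.
print(stacked_apy([0.02, 0.01]))  # ~3% on the same capital
```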
Right now, however, things are not normal, and they probably won’t be for a while.

Why is yield farming so hot right now?

Because of liquidity mining. Liquidity mining supercharges yield farming.
Liquidity mining is when a yield farmer gets a new token as well as the usual return (that’s the “mining” part) in exchange for the farmer’s liquidity.
“The idea is that stimulating usage of the platform increases the value of the token, thereby creating a positive usage loop to attract users,” said Richard Ma of smart-contract auditor Quantstamp.
The yield farming examples above are only farming yield off the normal operations of different platforms. Supply liquidity to Compound or Uniswap and get a little cut of the business that runs over the protocols – very vanilla.
But Compound announced earlier this year it wanted to truly decentralize the product and it wanted to give a good amount of ownership to the people who made it popular by using it. That ownership would take the form of the COMP token.
Lest this sound too altruistic, keep in mind that the people who created it (the team and the investors) owned more than half of the equity. By giving away a healthy proportion to users, that was very likely to make it a much more popular place for lending. In turn, that would make everyone’s stake worth much more.
So, Compound announced this four-year period where the protocol would give out COMP tokens to users, a fixed amount every day until it was gone. These COMP tokens control the protocol, just as shareholders ultimately control publicly traded companies.
Every day, the Compound protocol looks at everyone who had lent money to the application and who had borrowed from it and gives them COMP proportional to their share of the day’s total business.
The results were very surprising, even to Compound’s biggest promoters.
This was a brand-new kind of yield on a deposit into Compound. In fact, it was a way to earn a yield on a loan, as well, which is very weird: Who has ever heard of a borrower earning a return on a debt from their lender?
COMP’s value has consistently been well over $200 since it started distributing on June 15. We did the math elsewhere but long story short: investors with fairly deep pockets can make a strong gain maximizing their daily returns in COMP. It is, in a way, free money.
It’s possible to lend to Compound, borrow from it, deposit what you borrowed and so on. This can be done multiple times and DeFi startup Instadapp even built a tool to make it as capital-efficient as possible.
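Each loop lets you borrow a fraction of what you just supplied and deposit it again, so total supplied balance grows as a geometric series. A sketch, where the $100,000 principal and 75% collateral factor are illustrative (collateral factors vary by asset on Compound):

```python
def looped_exposure(principal: float, collateral_factor: float,
                    rounds: int) -> float:
    """Total supplied after repeatedly depositing, borrowing against
    the deposit, and re-depositing the borrowed funds.

    Each round lets you borrow `collateral_factor` of what you just
    supplied -- a geometric series.
    """
    supplied, deposit = 0.0, float(principal)
    for _ in range(rounds):
        supplied += deposit
        deposit *= collateral_factor  # borrow against it, redeposit
    return supplied

# Hypothetical: $100,000 looped 4 times at a 75% collateral factor.
print(looped_exposure(100_000, 0.75, 4))  # ~$273,437.50 supplied in total
```

Since COMP accrues to both lending and borrowing, every loop earns on both sides, which is why tools like Instadapp's exist to squeeze out the maximum.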
“Yield farmers are extremely creative. They find ways to ‘stack’ yields and even earn multiple governance tokens at once,” said Spencer Noon of DTC Capital.
COMP’s value spike is a temporary situation. The COMP distribution will only last four years and then there won’t be any more. Further, most people agree that the high price now is driven by the low float (that is, how much COMP is actually free to trade on the market – it will never be this low again). So the value will probably gradually go down, and that’s why savvy investors are trying to earn as much as they can now.
Appealing to the speculative instincts of diehard crypto traders has proven to be a great way to increase liquidity on Compound. This fattens some pockets but also improves the user experience for all kinds of Compound users, including those who would use it whether they were going to earn COMP or not.
As usual in crypto, when entrepreneurs see something successful, they imitate it. Balancer was the next protocol to start distributing a governance token, BAL, to liquidity providers. Flash loan provider bZx has announced a plan. Ren, Curve and Synthetix also teamed up to promote a liquidity pool on Curve.
It is a fair bet many of the more well-known DeFi projects will announce some kind of coin that can be mined by providing liquidity.
The case to watch here is Uniswap versus Balancer. Balancer can do the same thing Uniswap does, but most users who want to do a quick token trade through their wallet use Uniswap. It will be interesting to see if Balancer’s BAL token convinces Uniswap’s liquidity providers to defect.
So far, though, more liquidity has gone into Uniswap since the BAL announcement, according to its data site. That said, even more has gone into Balancer.

Did liquidity mining start with COMP?

No, but it was the most-used protocol with the most carefully designed liquidity mining scheme.
This point is debated but the origins of liquidity mining probably date back to Fcoin, a Chinese exchange that created a token in 2018 that rewarded people for making trades. You won’t believe what happened next! Just kidding, you will: People just started running bots to do pointless trades with themselves to earn the token.
Similarly, EOS is a blockchain where transactions are basically free, but since nothing is really free the absence of friction was an invitation for spam. Someone who didn’t like EOS created a token called EIDOS on the network in late 2019. It rewarded people for tons of pointless transactions and somehow got an exchange listing.
These initiatives illustrated how quickly crypto users respond to incentives.
Read more: Compound Changes COMP Distribution Rules Following ‘Yield Farming’ Frenzy
Fcoin aside, liquidity mining as we now know it first showed up on Ethereum when the marketplace for synthetic tokens, Synthetix, announced in July 2019 an award in its SNX token for users who helped add liquidity to the sETH/ETH pool on Uniswap. By October, that was one of Uniswap’s biggest pools.
When Compound Labs, the company that launched the Compound protocol, decided to create COMP, the governance token, the firm took months designing just what kind of behavior it wanted and how to incentivize it. Even still, Compound Labs was surprised by the response. It led to unintended consequences such as crowding into a previously unpopular market (lending and borrowing BAT) in order to mine as much COMP as possible.
Just last week, 115 different COMP wallet addresses – senators in Compound’s ever-changing legislature – voted to change the distribution mechanism in hopes of spreading liquidity out across the markets again.

Is there DeFi for bitcoin?

Yes, on Ethereum.
Nothing has beaten bitcoin over time for returns, but there’s one thing bitcoin can’t do on its own: create more bitcoin.
A smart trader can get in and out of bitcoin and dollars in a way that will earn them more bitcoin, but this is tedious and risky. It takes a certain kind of person.
DeFi, however, offers ways to grow one’s bitcoin holdings – though somewhat indirectly.
For example, a user can create a simulated bitcoin on Ethereum using BitGo’s WBTC system. They put BTC in and get the same amount back out in freshly minted WBTC. WBTC can be traded back for BTC at any time, so it tends to be worth the same as BTC.
Then the user can take that WBTC, stake it on Compound and earn a few percent each year in yield on their BTC. Odds are, the people who borrow that WBTC are probably doing it to short BTC (that is, they will sell it immediately, buy it back when the price goes down, close the loan and keep the difference).
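The arithmetic of "growing one's bitcoin" this way is plain compounding, denominated in BTC rather than dollars. A sketch with a hypothetical 1% lending rate:

```python
def btc_after_yield(btc: float, apy: float, years: int) -> float:
    """BTC balance grown by lending wrapped BTC (WBTC) at a given APY.

    Simple annual compounding; real Compound rates float from
    block to block, so this is only an approximation.
    """
    return btc * (1 + apy) ** years

# Hypothetical: 1 BTC wrapped to WBTC and lent at 1% for 3 years.
print(btc_after_yield(1.0, 0.01, 3))  # ~1.0303 BTC
```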
A long HODLer is happy to gain fresh BTC off their counterparty’s short-term win. That’s the game.

How risky is it?

Enough.
“DeFi, with the combination of an assortment of digital funds, automation of key processes, and more complex incentive structures that work across protocols – each with their own rapidly changing tech and governance practices – make for new types of security risks,” said Liz Steininger of Least Authority, a crypto security auditor. “Yet, despite these risks, the high yields are undeniably attractive to draw more users.”
We’ve seen big failures in DeFi products. MakerDAO had one so bad this year it’s called “Black Thursday.” There was also the exploit against flash loan provider bZx. These things do break and when they do money gets taken.
Right now, the deal is too good for certain funds to resist, so they are moving a lot of money into these protocols to liquidity mine all the new governance tokens they can. But the funds – entities that pool the resources of typically well-to-do crypto investors – are also hedging. Nexus Mutual, a DeFi insurance provider of sorts, told CoinDesk it has maxed out its available coverage on these liquidity applications. Opyn, the trustless derivatives maker, created a way to short COMP, just in case this game comes to naught.
And weird things have arisen. For example, there’s currently more DAI on Compound than has been minted in the world. This makes sense once unpacked but it still feels dicey to everyone.
That said, distributing governance tokens might make things a lot less risky for startups, at least with regard to the money cops.
“Protocols distributing their tokens to the public, meaning that there’s a new secondary listing for SAFT tokens, [gives] plausible deniability from any security accusation,” Zehavi wrote. (The Simple Agreement for Future Tokens was a legal structure favored by many token issuers during the ICO craze.)
Whether a cryptocurrency is adequately decentralized has been a key feature of ICO settlements with the U.S. Securities and Exchange Commission (SEC).

What’s next for yield farming? (A prediction)

COMP turned out to be a bit of a surprise to the DeFi world, in technical ways and others. It has inspired a wave of new thinking.
“Other projects are working on similar things,” said Nexus Mutual founder Hugh Karp. In fact, informed sources tell CoinDesk brand-new projects will launch with these models.
We might soon see more prosaic yield farming applications. For example, forms of profit-sharing that reward certain kinds of behavior.
Imagine if COMP holders decided, for example, that the protocol needed more people to put money in and leave it there longer. The community could create a proposal that shaved off a little of each token’s yield and paid that portion out only to the tokens that were older than six months. It probably wouldn’t be much, but an investor with the right time horizon and risk profile might take it into consideration before making a withdrawal.
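That hypothetical proposal is easy to express in code. Everything here is invented for illustration – no live protocol works this way – but it shows how mechanically simple such a rule would be:

```python
def age_weighted_payout(bonus_pool: float, stakes: list) -> list:
    """Pay a bonus only to deposits older than six months, pro rata.

    `stakes` is a list of (amount, age_in_days) tuples -- a sketch of
    the hypothetical governance proposal described in the text.
    """
    SIX_MONTHS = 182
    eligible_total = sum(amt for amt, age in stakes if age > SIX_MONTHS)
    if eligible_total == 0:
        return [0.0 for _ in stakes]
    return [bonus_pool * amt / eligible_total if age > SIX_MONTHS else 0.0
            for amt, age in stakes]

# Three hypothetical deposits; only the two old ones split the bonus.
payouts = age_weighted_payout(100.0, [(5_000, 400), (5_000, 30), (10_000, 200)])
print(payouts)
```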
(There are precedents for this in traditional finance: A 10-year Treasury bond normally yields more than a one-month T-bill even though they’re both backed by the full faith and credit of Uncle Sam, a 12-month certificate of deposit pays higher interest than a checking account at the same bank, and so on.)
As this sector gets more robust, its architects will come up with ever more robust ways to optimize liquidity incentives in increasingly refined ways. We could see token holders greenlighting more ways for investors to profit from DeFi niches.
Questions abound for this nascent industry: What will MakerDAO do to restore its spot as the king of DeFi? Will Uniswap join the liquidity mining trend? Will anyone stick all these governance tokens into a decentralized autonomous organization (DAO)? Or would that be a yield farmers co-op?
Whatever happens, crypto’s yield farmers will keep moving fast. Some fresh fields may open and some may soon bear much less luscious fruit.
But that’s the nice thing about farming in DeFi: It is very easy to switch fields.