Syscoin Platform’s Great Reddit Scaling Bake-off Proposal
We are excited to participate and present Syscoin Platform's ideal characteristics and capabilities towards a well-rounded Reddit Community Points solution!

Our scaling solution for Reddit Community Points involves 2-way peg interoperability with Ethereum. This provides a scalable token layer built specifically for speed and high volumes of simple value transfers at a very low cost, while providing sovereign ownership and on-chain finality. Token transfers scale by taking advantage of a globally sorting mempool that provides probabilistically secure assumptions of "as good as settled". The opportunity here for token receivers is app-layer interactivity on the speed/security trade-off (99.9999% assurance within 10 seconds). We call this Z-DAG, and it achieves high throughput across a mesh network topology presently composed of about 2,000 geographically dispersed full nodes. Unlike Bitcoin, these nodes are incentivized to run full nodes for the benefit of network security, through a bonded validator scheme. These nodes do not participate in the consensus of transactions or block validation any differently than other nodes, and therefore do not degrade Bitcoin's validate-first-then-trust security model across every node. Each token transfer settles on-chain.

The protocol follows Bitcoin Core policies, so it has adequate code coverage and protocol hardening to qualify as production-quality software. It shares a significant portion of Bitcoin's own hashpower through merged mining. This platform as a whole can serve token microtransactions, larger settlements, and store-of-value in an ideal fashion, providing probabilistic scalability while remaining decentralized according to Bitcoin's design. It is accessible to ERC-20 via a permissionless and trust-minimized bridge that works in both directions.
The bridge and token platform are currently live on the Syscoin mainnet, and have recently been gaining attention for use by loyalty-point programs and stablecoins such as Binance USD.
Syscoin Foundation identified a few paths for Reddit to leverage this infrastructure, each with trade-offs. The first provides the most cost-savings and scaling benefits at some sacrifice of token autonomy. The second preserves more autonomy, with a narrower scope of cost savings than the first, though still meaningful savings. The third introduces more complexity than the previous two yet provides the most overall benefits. We consider the third the most viable, as it lets Reddit benefit while retaining existing smart-contract functionality. We will focus on the third option, and include the first two for good measure.
Distribution, burns and user-to-user transfers of Reddit Points are entirely carried out on the Syscoin network. This full-on approach to utilizing the Syscoin network provides the most scalability and transaction cost benefits of these scenarios. The tradeoff here is distribution and subscription handling likely migrating away from smart contracts into the application layer.
The Reddit Community Points ecosystem can continue to use existing smart contracts as they are used today on the Ethereum mainchain. Users migrate a portion of their tokens to Syscoin, the scaling network, to gain much lower fees, scalability, and a proven base layer, without sacrificing sovereign ownership. They would use Syscoin for user-to-user transfers: tips redeemable in ten seconds or less, a high-throughput relay network, and on-chain settlement at a block target of 60 seconds.
Integration between Matic Network and Syscoin Platform - similar to Syscoin’s current integration with Ethereum - will provide Reddit Community Points with EVM scalability (including the Memberships ERC777 operator) on the Matic side, and performant simple value transfers, robust decentralized security, and sovereign store-of-value on the Syscoin side. It’s “the best of both worlds”. The trade-off is more complex interoperability.
Syscoin + Matic Integration
Matic and Blockchain Foundry Inc, the public company formed by the founders of Syscoin, recently entered a partnership for joint research and business development initiatives. This is ideal for all parties as Matic Network and Syscoin Platform provide complementary utility. Syscoin offers characteristics for sovereign ownership and security based on Bitcoin’s time-tested model, and shares a significant portion of Bitcoin’s own hashpower. Syscoin’s focus is on secure and scalable simple value transfers, trust-minimized interoperability, and opt-in regulatory compliance for tokenized assets rather than scalability for smart contract execution. On the other hand, Matic Network can provide scalable EVM for smart contract execution. Reddit Community Points can benefit from both. Syscoin + Matic integration is actively being explored by both teams, as it is helpful to Reddit, Ethereum, and the industry as a whole.
Total cost for these 100k transactions: $0.63 USD. See the live fee comparison for a savings estimate between transactions on Ethereum and Syscoin. A snapshot at the time of writing:

ETH price: $318.55
ETH gas price: 55.00 Gwei ($0.37)
Syscoin price: $0.11
(Snapshot of live fee comparison chart)

Z-DAG provides a more efficient fee market. A typical Z-DAG transaction costs 0.0000582 SYS. Tokens can be safely redeemed/re-spent within seconds, or allowed to settle on-chain beforehand. Costs should remain about this low for microtransactions.

Syscoin will achieve further reduction of fees and even greater scalability with off-chain payment channels for assets, with Z-DAG as a resilience fallback. New payment-channel technology is one of the topics under research by the Syscoin development team with our academic partners at TU Delft. In line with the calculation in the Lightning Network white paper, payment channels using assets with Syscoin Core will bring theoretical capacity for each person on Earth (7.8 billion) to have five on-chain transactions per year, per person, without requiring anyone to enter a fee market (i.e. "wait for a block"). This exceeds the minimum LN expectation of two transactions per person per year: one to exist on-chain and one to settle aggregated value.
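The fee figures above can be sanity-checked with a short script. The prices and per-transaction fee are the snapshot numbers quoted in this section (they change constantly; treat them as illustrative), and the result lands within a cent of the $0.63 quoted above, the gap being price rounding:

```python
# Sanity check of the fee figures quoted above, using the snapshot
# numbers from this section (prices change constantly; treat these as
# illustrative, not live).
SYS_PRICE_USD = 0.11        # snapshot Syscoin price
ZDAG_FEE_SYS = 0.0000582    # typical Z-DAG transaction fee, in SYS

fee_usd = ZDAG_FEE_SYS * SYS_PRICE_USD
total_100k_usd = fee_usd * 100_000

print(f"one Z-DAG tx:      ${fee_usd:.8f}")
print(f"100,000 Z-DAG txs: ${total_100k_usd:.2f}")
```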
Tools to simplify using Syscoin Bridge as a service with dapps and wallets will be released some time after implementation of Syscoin Core 4.2. These will be based upon the same processes which are automated in the current live Sysethereum Dapp that is functioning with the Syscoin mainnet.
The Syscoin Ethereum Bridge is secured by Agent nodes participating in a decentralized and incentivized model involving the roles of Superblock challengers and submitters. This model is open to participation. The benefits are trust-minimization, permissionlessness, and potentially less legal/regulatory red tape than interop mechanisms that involve liquidity providers and/or trading mechanisms. The trade-off is that, due to the decentralized nature, cross-chain settlement takes one hour from Ethereum to Syscoin, and three hours from Syscoin to Ethereum. We are exploring ways to reduce this time while maintaining decentralization via zero-knowledge proofs. Even so, an "instant bridge" experience could be provided by means of a third-party liquidity mechanism; that option exists but is not required for bridge functionality today. Typically bridges are used with batch value rather than high frequencies of smaller values, and it is generally advantageous to keep some value on both chains for maximum availability of utility. Even so, the cross-chain settlement time is worth mentioning here.
Ethereum -> Syscoin: Matic or Ethereum transaction fee for bridge contract interaction; negligible Syscoin transaction fee for minting tokens.
Syscoin -> Ethereum: negligible Syscoin transaction fee for burning tokens; 0.01% transaction fee paid to the Bridge Agent in the form of the ERC-20; Matic or Ethereum transaction fee for contract interaction.
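The crossing cost under this fee schedule can be sketched as follows. The gas amounts are placeholder assumptions for illustration; only the 0.01% Bridge Agent fee comes from the text:

```python
# Illustrative crossing cost under the fee schedule above. The gas
# cost passed in is a placeholder assumption; only the 0.01% Bridge
# Agent fee rate comes from the text.
AGENT_FEE_RATE = 0.0001     # 0.01% paid to the Bridge Agent (Syscoin -> Ethereum)

def syscoin_to_ethereum_cost(amount_tokens: float, eth_gas_usd: float):
    """Return (agent fee in tokens, gas cost in USD) for one crossing."""
    return amount_tokens * AGENT_FEE_RATE, eth_gas_usd

agent_fee, gas_usd = syscoin_to_ethereum_cost(10_000, eth_gas_usd=0.37)
print(f"agent fee: {agent_fee} tokens, plus ${gas_usd} in gas")
```

This is why bridges suit batch value: the percentage fee scales with the amount, while gas is flat per crossing.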
Zero-Confirmation Directed Acyclic Graph (Z-DAG) is an instant-settlement protocol used as a complementary system to proof-of-work (PoW) in the confirmation of Syscoin service transactions. In essence, a Z-DAG is simply a directed acyclic graph (DAG) in which validating nodes verify the sequential ordering of transactions received in their memory pools. Z-DAG is used by validating nodes across the network to ensure absolute consensus on the ordering of transactions and that no balances are overdrawn (no double-spends).
Unique fee-market that is more efficient for microtransaction redemption and settlement
Uses decentralized means to enable tokens with value-transfer scalability comparable to or exceeding that of credit card networks
Provides high throughput and secure fulfillment even if blocks are full
Probabilistic and interactive
99.9999% security assurance within 10 seconds
Can serve payment channels as a resilience fallback that is faster and lower-cost than falling-back directly to a blockchain
Each Z-DAG transaction also settles onchain through Syscoin Core at 60-second block target using SHA-256 Proof of Work consensus
Z-DAG enables the ideal speed/security trade-off to be determined per use-case at the application layer. It minimizes the sacrifice required to accept and redeem fast transfers/payments while providing more-than-ample security for microtransactions. This rests on the premise that a Reddit user receiving points does need security, yet generally doesn't want or need to wait for the same level of security as a nation-state settling an international trade debt. In any case, each Z-DAG transaction settles on-chain at a block target of 60 seconds.
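The mempool-ordering idea behind Z-DAG can be illustrated with a toy sketch. This is not Syscoin's actual implementation, just a minimal model of the rule described above: a node tracks which inputs it has already seen spent and rejects conflicting respends before on-chain settlement:

```python
# Toy sketch of the Z-DAG idea described above (NOT Syscoin's actual
# implementation): a node tracks which inputs it has already seen
# spent in its mempool and rejects conflicting respends before
# on-chain settlement.
class ZdagMempool:
    def __init__(self):
        self.spent_inputs = set()          # inputs seen spent so far

    def accept(self, tx_inputs) -> bool:
        """Accept a transaction only if none of its inputs conflict."""
        if any(i in self.spent_inputs for i in tx_inputs):
            return False                   # double-spend attempt: reject
        self.spent_inputs.update(tx_inputs)
        return True

pool = ZdagMempool()
print(pool.accept({"utxo:1"}))   # first spend is fine
print(pool.accept({"utxo:1"}))   # conflicting respend is rejected
```

In the real protocol this check runs across the whole mesh of nodes, which is where the probabilistic "as good as settled" assurance within seconds comes from.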
Syscoin 3.0 White Paper (4.0 white paper is pending. For improved scalability and less blockchain bloat, some features of v3 no longer exist in current v4: Specifically Marketplace Offers, Aliases, Escrow, Certificates, Pruning, Encrypted Messaging)
16 MB of block bandwidth per minute, assuming segwit witness-carrying transactions averaging ~200 bytes
SHA256 merge mined with Bitcoin
UTXO asset layer, with base Syscoin layer sharing identical security policies as Bitcoin Core
Z-DAG on asset layer, bridge to Ethereum on asset layer
On-chain scaling with the prospect of enabling enterprise-grade, reliable, trustless payment processing with an on/off-chain hybrid solution
Focus only on simple value transfers. The minimum viable blockchain consensus footprint is balances and their ownership. Everything else can trade data availability for scale (the Ethereum 2.0 model); we leave that to other designs and focus on transfers.
Future integrations of MAST/Taproot to get more complex value transfers without trading off trustlessness or decentralization.
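The bandwidth figure in the list above implies a concrete throughput ceiling. A quick back-of-envelope check, assuming the 16 MB/minute and ~200-byte averages quoted:

```python
# Back-of-envelope throughput implied by the bandwidth figure above:
# 16 MB of block space per minute and ~200-byte average transactions.
BANDWIDTH_BYTES_PER_MIN = 16_000_000   # 16 MB per minute (decimal MB assumed)
AVG_TX_BYTES = 200

tx_per_minute = BANDWIDTH_BYTES_PER_MIN // AVG_TX_BYTES
tps = tx_per_minute / 60

print(f"{tx_per_minute} tx/min, about {tps:.0f} tx/sec")
```

Roughly 1,300+ transactions per second of on-chain capacity, which is the basis for the credit-card-network comparison made earlier.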
Zero-knowledge proofs are a new cryptographic frontier. We are dabbling here to generalize the concept of bridging and also to verify the state of a chain efficiently. We also apply them in our digital identity projects at Blockchain Foundry (a publicly traded company which develops Syscoin software for clients). We are also looking to integrate privacy-preserving payment channels for off-chain payments through a zkSNARK hub-and-spoke design which does not suffer from the HTLC attack vectors evident on LN. Many of the issues plaguing Lightning Network can be resolved using a zkSNARK design while also providing the ability to do a multi-asset payment-channel system. We recently found a showstopper attack (American Call Option) on LN if it were used with multiple assets; that attack would not exist in a system such as this.
Web3 and mobile wallets are under active development by Blockchain Foundry Inc as WebAssembly applications, and are expected for release not long after mainnet deployment of Syscoin Core 4.2. Both will be multi-coin wallets that support Syscoin, SPTs, Ethereum, and ERC-20 tokens. The Web3 wallet will provide functionality similar to Metamask. Syscoin Platform and tokens are already integrated with Blockbook. Custom hardware-wallet support currently exists via ElectrumSys. First-class hardware-wallet integration through apps such as Ledger Live will exist after 4.2. Currently supported wallets: Syscoin Spark Desktop and Syscoin-Qt.
IOTA: Centralization, Coordinator, and whether it'll work without the Coordinator
Many people appear to be confused about what IOTA's coordinator does. Some argue that IOTA is centralized due to the existence of the coordinator, and some also argue that IOTA has never proven that it works without the coordinator. I elaborate on these points below.

Milestone/Checkpoint

First, IOTA's coordinator only issues a normal signed transaction, commonly referred to as a "milestone." Currently, as long as a transaction is referenced by this milestone, the transaction is confirmed. The reason and purpose of these milestones is simple: to protect the network (the "Tangle") at this early stage against hostile attacks, and to defend against double-spends or spending from non-existent funds. It is important to note that many cryptos have also issued these types of milestones. For example, Satoshi himself implemented "checkpoints" to protect BTC's early blockchain in 2010, until they were removed approximately four years later in 2014. Satoshi stated on July 17, 2010 at bitcointalk.org:
- Added a simple security safeguard that locks-in the block chain up to this point. The security safeguard makes it so even if someone does have more than 50% of the network's CPU power, they can't try to go back and redo the block chain before yesterday (if you have this update). I'll probably put a checkpoint in each version from now on. Once the software has settled what the widely accepted block chain is, there's no point in leaving open the unwanted non-zero possibility of revision months later.
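The mechanism Satoshi describes can be sketched in a few lines. This is illustrative only (the heights and hashes below are made up): a client hard-codes known-good (height, block hash) pairs and rejects any chain that disagrees with them, regardless of how much proof-of-work it carries:

```python
# Minimal sketch of how checkpoints of the kind Satoshi describes work:
# the client hard-codes known-good (height, block hash) pairs and
# rejects any chain that disagrees with them, regardless of its
# accumulated proof-of-work. Heights and hashes here are made up.
CHECKPOINTS = {11111: "0000a1", 33333: "0000b2"}   # height -> locked-in hash

def passes_checkpoints(chain: dict) -> bool:
    """chain maps height -> block hash; heights not yet reached are fine."""
    return all(chain.get(height) in (None, locked)
               for height, locked in CHECKPOINTS.items())

honest  = {11111: "0000a1", 33333: "0000b2"}
rewrite = {11111: "ffffff", 33333: "0000b2"}       # attacker redid old history
print(passes_checkpoints(honest), passes_checkpoints(rewrite))
```

A milestone plays the same role in IOTA: a trusted reference point that an attacker cannot rewrite, regardless of resources.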
Litecoin (LTC) also used checkpoints, and recently, on November 15, 2018, even Bitcoin Cash ABC implemented checkpoints to protect its network. In quick summary: checkpoints, or in IOTA's case "milestones," are nothing new in crypto and are often implemented to secure a new and young network. Keep in mind that BTC removed checkpoints in 2014, approximately four years after inception; IOTA is much younger.

IOTA works without the coordinator

Second, and this is of key importance, IOTA already works without the coordinator. This is akin to Bitcoin working without checkpoints. The purpose of the checkpoints was to prevent attacks during Bitcoin's young and susceptible age; similarly, IOTA also works now without the coordinator, but without it, the network is open to attacks due to its own early age and susceptibility. Now, some may argue that if IOTA cannot defend against attacks then it "does not work." If that is one's definition of whether a cryptocurrency works, then many cryptocurrencies have not worked for several years, including BTC until 2014, and one can also argue that Bitcoin Cash ABC does not work currently because it still utilizes checkpoints. Either opinion is valid; however, one cannot argue that IOTA is centralized and/or does not work without accepting that BTC was also centralized and/or did not work for four years, and that Bitcoin Cash ABC is also centralized and/or does not work right now because of checkpoints. Notwithstanding, IOTA, through its foundation, is researching multiple ways to remove the coordinator while also securing the network. Again, this is different from ensuring that IOTA works (it already does); the IOTA Foundation is looking for ways to secure the network against attacks without the coordinator.
IOTA is very young in comparison to many cryptocurrencies, and the Foundation assumes that massive resources may be used to attack IOTA if the coordinator is removed. Therefore, the Foundation is working on multiple ways to secure the IOTA network.

IOTA works better without the coordinator

Finally, while most FUDsters are happy to point out that IOTA utilizes a coordinator and therefore "does not work," they forget to mention that the coordinator is a limiting factor in the scalability of the network. Since every transaction must reference a milestone, and confirmations are accordingly dependent on those milestones, confirmations are often slowed down. In fact, transactions will speed up and the network will scale much higher without the coordinator. Over time, not only will IOTA implement some of its already published methods to secure the network; IOTA will also be further secured by the higher transactions-per-second rate an attacker must overcome.

To the fans: if you believe this is exciting, remember we have barely scratched the surface, with Qubic still in development.

TLDR version: The coordinator only exists to secure against attacks. IOTA works without the coordinator, but would then be susceptible to attacks. Other cryptocurrencies, including BTC and LTC, have used something similar to a coordinator in the form of "checkpoints," and Bitcoin Cash ABC still utilizes them. Finally, IOTA will get faster and scale better once the coordinator is removed.

edit: typos
Numbers on the screen, or how do digital payment systems make the market fair?
Continuing the trend toward practicality characteristic of the XXI century, paper money is gradually disappearing from our lives, giving way to more practical digital storage. However, the digitized banking that we now use every day is still far from perfect. For starters, it is completely controlled by third parties. No one owns the numbers they see on the screen; control belongs entirely to third parties, such as banks.

Banks create money out of thin air, and credit is a prime example of this. Money is no longer printed when someone takes out an overdraft or mortgage; it is simply created out of nothing. Moreover, banks charge disproportionately high fees for services that are outdated and impractical today. For example, it is impractical to pay a commission to spend your money abroad, just as it is impractical to wait a few days for the transfer of a small amount from you to a relative to be verified. None of this makes sense in the interconnected and instantaneous world in which we live today.

Thus, the monetary system has ceased to be practical, and it is being replaced by a higher form of value storage: a faster and safer system that eliminates expensive operations and gives control back to the person.

The money you have in your bank account can be considered a virtual currency, since it has no physical form and exists only in the bank's ledger. If the ledger is lost, your money simply disappears. These are just numbers that you see on the screen, stored on the hard drives of bank servers. Do you open a regular app and think you have money? They are just bytes in a computer system.
Today's global payment infrastructure moves money from one payment system to another through a series of internal deposit transfers between financial institutions. Since these transfers occur across different systems with little coordination, settlement of funds is slow, often taking 3–5 days and trapping liquidity in transit.
How do payments work?
When you make a money transfer, for example from your bank card to a friend's bank card, you see an instant transfer: numbers move, so to speak, from you to the recipient. For the user, the transfer appears instantaneous, but the exchange of obligations between the participants in the process takes 3–7 days; the user does not know about this and hardly ever thinks about it.

When you make a payment at a supermarket or any other point of sale, information from the POS terminal is sent to the acquiring bank; the acquiring bank then sends a request that passes through the payment system (Visa or MasterCard) and is then transmitted to your bank, which confirms the operation. At this point, no funds are actually debited. The funds are temporarily held, and the actual withdrawal takes place within a few days; the maximum processing time is up to 30 days.
Currency transactions and payments abroad
You may have noticed that after making a transaction in a currency different from that of your account, such as yen or dirhams, or after buying an item abroad, the amount charged may differ from the amount reflected immediately after payment.
Why is this happening?
As soon as you make a transaction with your bank card, the local bank transfers the information to the payment system (Visa or MasterCard), and the payment system converts the currency used into the billing currency.
Billing currency: the currency used for settlement between the payment system and your issuing bank. For the US, the billing currency is the dollar; in Europe, the euro.
The billing currency may also differ depending on the issuing bank (the bank that issued your debit card). For example, some banks use the euro as the billing currency when making payments with MasterCard cards in the United States, which leads to additional costs when converting euros into dollars. If the payment is in other currencies, the payment scheme becomes more complicated and, accordingly, more expensive. The transfer rate from one settlement currency to another is set by the payment system: Visa or MasterCard.

If the currency of your bank card is the same as the currency of the payment system, the payment takes place without additional operations. For example, if you have a dollar card and make a payment in dollars in the United States, no conversion is needed; but if you make a payment with a dollar card in Europe, your bank will convert the amount at its own exchange rate, which leads to additional costs. There are exceptions: some European banks can use dollars for settlements, but this is more an exception than a rule. Also, if you pay for purchases in China using a bank card denominated in euros, double conversion is inevitable. Thus payment in dollars works almost universally worldwide, except in the European Union countries; the dollar is a global currency and is therefore often the anchor in international settlements.

Now we understand that, due to differences between the account currency and the billing currency of the Visa or MasterCard payment system, additional conversions may occur, leading to additional bank fees. As a result, the actual payment amount will differ from the amount debited from your card. On top of the conversion in the payment system and the currency conversion in your bank, some banks charge an additional fee for conducting a cross-border transaction.
Where do we lose money when making debit card payments?
Conversion by the payment system (euro to dollar), or, when a payment is processed via MasterCard in Turkey, Turkish lira to euro plus an additional euro-to-dollar conversion on the side of the issuing bank (your bank);
Currency conversion by an acquiring bank;
The difference between the exchange rate on the purchase date and on the write-off date. For example, we purchase at a rate of 0.91 euros per dollar, and the write-off occurs at a rate of 0.94 euros per dollar;
A large number of currency conversions.
The more conversions there are, the more we lose on a purchase. For example, when paying in the UAE or China and buying goods in the local currency, the number of conversions multiplies.
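The rate-difference loss in the list above is easy to make concrete. A worked example using the illustrative 0.91 and 0.94 EUR-per-USD rates from the text:

```python
# Worked example of the exchange-rate loss listed above, using the
# illustrative rates from the text (0.91 then 0.94 EUR per USD).
price_usd = 100.0
rate_at_purchase = 0.91    # EUR per USD shown at checkout
rate_at_writeoff = 0.94    # EUR per USD when funds are actually debited

shown_eur   = price_usd * rate_at_purchase
charged_eur = price_usd * rate_at_writeoff

print(f"shown {shown_eur:.2f} EUR, charged {charged_eur:.2f} EUR, "
      f"rate move cost {charged_eur - shown_eur:.2f} EUR")
```

Each extra conversion in the chain adds another spread of this kind, which is why multi-currency purchases lose the most.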
If we touch on the topic of international transfers, we encounter additional nuances:
Payment processing time: international payments can take 3–5 days to process, as mentioned above, which in our dynamic times interferes with comfortable use of the system;
Restrictions on the amounts;
Possible requirements for certain documentation for payment confirmation;
Additional fees and commissions, sometimes hidden fees.
It is not always possible to make a transfer quickly and when necessary due to these restrictions. All this confirms the complexity of the operations and additional commissions that the user pays.
And now back to the numbers on the screen, this topic affects not only banks but also centralized cryptocurrency exchanges:
You top up your deposit on the exchange in cryptocurrency; from then on you are using numbers inside the exchange, while the real funds are most often kept in "cold storage" controlled by administrators or other responsible persons.
Only when you make a withdrawal from the exchange to your own wallet are you sent real funds (tokens or cryptocurrency).
The same applies to centralized applications and online services that deal with cryptocurrencies. There are many such services, both online and as apps, that are centralized, regardless of what they are called: Bitcoin wallet, Bitcoin exchange, and so on. This means that when you add funds to an account in such a wallet, the funds are stored on the developer company's side. In simple words, all your funds sit in wallets belonging to the system's creators. If you use a centralized app, you run the risk of losing your funds. Although the application deals in cryptocurrency, it does not honor cryptocurrency's main principle: decentralization. In other words, using systems with a central authority, especially in the cryptocurrency market, increases risk, so we recommend using decentralized systems for storing currency to reduce risks to a minimum.
Decentralization is the process of redistributing or dispersing functions, forces, power, people, or things away from a central location or governing body. Centralization is a condition in which the right to make the most important decisions remains with the highest levels of management.
Using decentralized tools, for example a local Tkeycoin wallet or the multi-currency blockchain TkeySpace wallet, your funds belong only to you and only you can use them. This eliminates the risk of a third party's bankruptcy, and such a decentralized architecture can also protect against natural disasters: given that there is no central server that can be damaged in a disaster, the system can keep working even with only 2 nodes.

Beyond force-majeure situations, you protect your funds from theft and from any sanctions by third parties; in our time, this is very important. The owner of Tkeycoin does not need bank branches, additional verifications, or permission to use, transfer, or even transport Tkeycoin. You can easily carry $1 million worth of Tkeycoin in your pocket and, even in theory, run into no trouble.

Besides, it is extremely convenient and safe to store even multibillion-dollar capital in Tkeycoin. Imagine that you have a great deal of money and need a safe place to store it. Where do you turn? A Swiss bank, of course; yes, but it can easily freeze your accounts, and you can easily lose your savings. In recent years many banks have been actively fighting gray non-cash funds (including offshore ones), and every month more and more legal proceedings arise on this basis. The fact is that serious money, for the most part, has a gray tinge, and only a tiny fraction of the billions and millions is clean before the law. That is why their owners are often summoned to court, subjected to pressure, forced to leave the country, and so on.
If your money is stored in Tkeycoin, you will not be subjected to such pressure and will avoid the lion's share of the troubles that usually accompany accounts with many zeros. Using peer-to-peer systems, you will not be called by a bank manager demanding documents, nor by a fraudster asking for your card number and an SMS code for confirmation. This simply does not happen: wallets are encrypted, and using different addresses guarantees privacy. As for transfer fees, there are no Visa or MasterCard payment systems involved, nor the additional fees we discussed above.
How are payments made in the Tkeycoin peer-to-peer payment system?
As soon as you sign a transaction, it is sent to the blockchain and miners handle its confirmation, for which they charge a symbolic commission. Let's look at an example: at a TKEY rate of $1, the transfer fee would be 0.00001970 TKEY or 0.00000174 TKEY.
Accordingly, commissions are almost zero. In Europe, on average, you will pay $15–20 for a small bank transfer. By comparison, sending 1 million dollars in BTC today, you would pay a commission in the area of ≈$3–8. Just think: 1 million dollars, without restrictions, risks, or sanctions; the transaction is available the same day, and you paid on average ≈$5 for the transfer.
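The gap between these fees is easier to appreciate side by side. The $1 TKEY rate and the 0.00001970 TKEY fee come from the example above; the bank fee is the midpoint of the quoted $15–20 range:

```python
# Fee comparison built from the figures above. The $1 TKEY rate and
# the 0.00001970 TKEY fee come from the text; the bank fee is the
# midpoint of the quoted European $15-20 range.
tkey_fee = 0.00001970        # TKEY per transfer
tkey_rate_usd = 1.0          # assumed rate from the example above
bank_fee_usd = 17.5          # midpoint of the European $15-20 range

tkey_fee_usd = tkey_fee * tkey_rate_usd
print(f"Tkeycoin transfer: ${tkey_fee_usd:.8f} vs bank transfer: ${bank_fee_usd:.2f}")
print(f"roughly {bank_fee_usd / tkey_fee_usd:,.0f}x cheaper")
```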
Transactions in the Tkeycoin blockchain
Now let's touch on how a transaction goes through the blockchain. Once you have sent a transaction, it becomes visible to the recipient. The transaction takes place instantly, and the user sees not "numbers on the screen" but real funds: cryptocurrency. This is very convenient when you make a transaction and the recipient needs to make sure the payment arrived. In the full node there is a choice of confirmation depth: the number of blocks after which you can use the received cryptocurrency. When sending, you can select the number of confirmations:
As we can see, you can even set a week's worth of confirmations if necessary. The minimum recommended number is 3 blocks; by default, the full node (local wallet) is set to 6 blocks. This number of confirmations ensures that your block will not be forged and will be accepted by the network. Each new transaction that receives network approval is sent to the mempool, where it waits for miners to confirm it. When a miner picks up a transaction to include it in the next block, it automatically receives its first confirmation.
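The reasoning behind waiting for several confirmations can be made concrete with the attacker catch-up probability from the Bitcoin white paper, which applies to any proof-of-work chain (TKEY's included). With an attacker holding 10% of hashpower, the probability of rewriting a transaction falls below 0.1% by six confirmations, which is why six is a common default:

```python
import math

# Attacker catch-up probability from the Bitcoin white paper, which
# applies to any proof-of-work chain (TKEY's included). q is the
# attacker's share of hashpower, z the number of confirmations waited.
def attacker_success(q: float, z: int) -> float:
    p = 1.0 - q
    lam = z * (q / p)                       # expected attacker progress
    prob = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam**k / math.factorial(k)
        prob -= poisson * (1.0 - (q / p) ** (z - k))
    return prob

for z in (0, 1, 3, 6):
    print(f"z={z}: {attacker_success(0.10, z):.6f}")
```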
Generating blocks in the TKEY network
A block in the TKEY network is generated within 6–10 minutes; the network automatically adjusts the difficulty and block time. A block can hold thousands of transactions or just one.

Transactions work faster in the TKEYSPACE app because we have already enabled new algorithms there, and it is now the fastest and most convenient way to exchange various digital currencies. Still, using the full node remains one of the safest ways to store and send Tkeycoin, and most importantly, the full node stores a complete copy of the entire blockchain, which benefits the network and protects against forgery of information.

The more popular the project becomes, the more load is placed on the network itself. For example, 10,000 transactions may pass in one block that is processed quickly, while another 10–20 transactions in another block hang for a longer time, so temporary "pits" may appear. To deal with them, we are working on implementing additional chains: separate chains created for cross-transactions, which ensure fast payments under heavy load. Globally, this means a transfer anywhere in the world in 6–10 minutes, and in cross-chains in 10 seconds. Compared with the global payment system, which processes cross-border payments within 3–5 days, this is a huge advantage. If we add liquidity to this, we get a perfect payment system.

Also, you should not forget that if you were not synced with the network when you sent a transaction, the transaction may hang in its memory pool, and you will have to perform several actions to resolve the situation.
Here we must understand that syncing with the network is important: much as with a connection failure in internet banking, an unsynced payment will not be processed, because it never reaches anyone who can confirm it. If you are currently experiencing any delays with transactions, this is due to the transition from CPU mining to GPU mining; as soon as miners switch to the new mining methods, block confirmation will be consistently fast.

In conclusion: blockchain is a new technology, and many of its terms, concepts, and mechanics are still difficult for many people to understand, which is normal for an innovation. In many countries the words "cryptocurrency" and "blockchain" are treated as synonyms, and most people assume that anything involving a blockchain must be related to trading on a cryptocurrency exchange. Few think about the real usefulness of solutions that will become commonplace for us in the future. For example, internet banking dates back to the 1980s, when the Home Banking system was created in the United States. That system allowed depositors to check their accounts by connecting to the bank's computer over the phone. Later, as the Internet and related technologies developed, banks began introducing systems that let depositors get information about their accounts online. The service of transferring funds between accounts was first introduced in 1994 in the United States by the Stanford Federal Credit Union, and in 1995 the first virtual bank was created: Security First Network Bank. To the disappointment of its founders, it failed because of strong distrust from potential customers who, at the time, did not trust such an innovation. Only in 2001 did Bank of America become the first bank whose e-banking user base exceeded 2 million customers.
At that time, that figure was about 20% of all the bank's customers. In October of that same year, 2001, Bank of America passed the bar of 3 million money transfers made using online banking, for a total of more than 1 billion US dollars. Currently, in Western Europe and America, more than 50% of the entire adult population uses e-banking services, and the figure reaches 90% among adult Internet users. Life changes, and in the bustle of everyday work we do not even notice how quickly processes change around us. We are living through a technological revolution, and it is inevitable.
1) What are some challenges you think will come up in 2020? Some of the biggest challenges we see for 2020 are: Adoption; it is the most difficult, but it has the highest impact for the project and for the blockchain community at large. Perhaps the biggest challenge within adoption is education. It is critical that people understand how RenVM works, its capabilities, its limitations, where it is meant to be used, and where it is not. At times it has already proved difficult to cut through the misinformation and misunderstanding that proliferate throughout the wider community, or to explain complex cryptographic concepts. We will address these challenges by producing more educational material, releasing audits (when they are completed), engaging more with other communities, and open-sourcing more of the project. Rolling out updates to the Darknodes as fixes, improvements, and new features. This is a technical challenge, but also a social one. It requires comprehensive testing environments and a focus on backwards compatibility, but it also requires coordination and social agreement amongst hundreds (and potentially thousands) of Darknode owners. To face these challenges, we have already begun building more extensive testing frameworks and auto-update capabilities for the Darknodes, and we are looking at informal methods of governance (until we settle on a formal one). 2) As far as I understand, you already have a list of companies that will be the first to adopt RenVM and integrate it. How do these companies feel about the fact that not all repositories are open and some of the code is still closed (private repos)? Doesn't that scare them off? Do you have plans to go fully open-source? TL;DR: Yes, we absolutely plan to go fully open-source, and all projects we are in conversation with are aware of the plan below and understand the logic behind it.
Our logic and subsequent plan (and the thresholds needed) to go fully open-source can be found in this document, if you are curious: RenVM Open-Source Roadmap. Long answer: The team at Ren are strong proponents of the open-source ethos and believe all decentralized protocols need to be made open-source once secure. The Ren team also wants to (a) be competitive in this space, given the hard work and capital invested by the team and the community, and (b) give an appropriate amount of time for security issues to be discovered and fixed before making the codebase available to potentially malicious actors. With that said, all of Ren's codebase barring the RZL sMPC algorithm will be open-sourced at the advent of RenVM Mainnet SubZero, which will be launched after our security audit is completed. The RZL sMPC algorithm, however, is what makes RenVM and its sMPC solution so special. It has been pioneered in-house by our development team and can be considered a trade secret, so it will not be released to the wider public until certain security and capital thresholds within RenVM have been met. We have worked very hard over the last two years; this approach ensures RenVM's security and that the Ren community, team, and investors are rewarded for their patience. Stage 1 | Q1 2020, RenVM Mainnet security audit: the audit will verify our RZL sMPC algorithm's security, correctness, and functionality under a Non-Disclosure Agreement (NDA). The security audit, along with all other components of RenVM's code-base, will be available for the public to review when completed. Stage 2 | Q2 2020 onward: our RZL sMPC algorithm will be fully open-sourced to the public when the milestones outlined in this document are met: https://github.com/renproject/ren/issues/2 The document has purposefully been designed for open comment, and we encourage any stakeholder to voice their opinions or suggestions (supported by empirically-based evidence, of course).
The team will take the feedback and incorporate it into the RenVM Open-Source Roadmap thresholds. The open comment period will end at the formal release of RenVM Mainnet SubZero, at which point the specific security and capital thresholds will be solidified and presented to the public. If you have suggestions or questions, please do voice them on our GitHub: https://github.com/renproject/ren/issues/2 3) Any ideas on which DeFi dapps could or should, in your opinion, use RenVM? Any DeFi app can utilize RenVM. If their users want to trade or lend cross-chain pairs, then they could and should use RenVM to do so. With that said, a few potential use cases can be found below:
Cross-Chain Decentralized Exchanges,
Cross-Chain Lending & Leveraging,
Multi-Collateralized Synthetics and Stablecoins (e.g. DAI),
No-Counterparty Risk OTC Desk | An Interoperable Escrow,
We’ll also be releasing a blog post that outlines all the potential use cases for RenVM in the coming months, so please do stay tuned! 4) You have announced a partnership with AZTEC; what are your plans for cooperation with them, and are these plans still in force, or have your priorities changed? At this stage, our entire focus is on Mainnet SubZero. But we will definitely follow up on integrating with AZTEC once everything is released, stable, and adopted. 5) Can you expand on Universal Interoperability and how important a role it will play in the future? What are the qualifications of being that third party? TL;DR: The ultimate goal of Universal Interoperability is to ensure a great user experience (UX) regardless of which blockchain a user comes from or their end destination. In immediate terms, this means abstracting away confirmation wait times and the need for ETH gas, so someone could use BTC on a DeFi app (built on Ethereum) and never know it. Long answer: The number and speed of confirmations inherently depend on the original chain and must be set at the time the chain is admitted into the protocol. RenVM mainnet will wait for 6 confirmations for BTC (on Chaosnet, it is 2). This obviously takes a long time and, while not so bad for some use cases (lending, collateralization, etc.), it is not the best for dApps/DEXs and general UX. So we have the concept of Universal Interoperability, which allows a third party to provide two things (in exchange for a fee nominated by users): 1) Expedited confirmations | Confirmation as a Service (CaaS). CaaS removes the confirmation wait time for users when minting digital assets on Ethereum. It provides speed by taking on the confirmation risk: the third party sees you have (say) 1 confirmation and is confident you are not a miner about to attack Bitcoin.
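The expedited-confirmation decision just described can be condensed into a toy pricing function. Everything here (names, the flat fee model, the constants) is a hypothetical illustration, not RenVM's actual API:

```python
# Hypothetical sketch of the CaaS decision: a third party decides whether
# to front shifted BTC before full finality, charging for the remaining
# confirmation risk. All names and the fee model are illustrative.

FULL_CONFIRMATIONS = 6  # RenVM mainnet's stated BTC finality depth

def caas_quote(confirmations, amount_btc, fee_rate=0.001):
    """Return whether to front funds now and the risk fee charged."""
    remaining = max(FULL_CONFIRMATIONS - confirmations, 0)
    if remaining == 0:
        return {"front_now": False, "fee_btc": 0.0}  # already final
    fee = round(amount_btc * fee_rate * remaining, 8)  # more risk, higher fee
    return {"front_now": True, "fee_btc": fee}

quote = caas_quote(confirmations=1, amount_btc=0.5)  # fee covers 5 missing confs
```

A real provider would of course price risk from chain conditions rather than a flat per-confirmation rate; the point is only that the fee compensates for the confirmations still outstanding.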
They come in and provide the shifted BTC immediately to complete whatever action you were taking, and when the real underlying shift finishes, the third party gets its funds back. By implementing CaaS, developers can help users complete actions faster by using funds that have already been shifted. These funds can be accessed in a variety of trustful and trustless ways, but the goal is the same: facilitate a cross-chain transaction in a shorter time than it would take the user to first fully shift in an asset (i.e. for BTC, RenVM waits for six (6) confirmations). 2) GSN integration | Gas as a Service (GaaS). GaaS removes the need for users to deal with ETH gas in native blockchain interactions. It provides gas so you don't need to manage lots of different tokens, just the ones you are actually using in the dApp. By implementing GaaS, dApps can pay for their users' transactions in a secure way, so users don't need to hold ETH to pay for gas or even set up an account. This can be particularly valuable for cross-chain applications, as you might be building for a non-Ethereum user base. We'll be releasing a demo of these working in the real world quite soon, along with tutorials for third parties to use these solutions if they so choose. 6) What's the path forward for more liquidity on the REN token? Currently US users are limited in where they can purchase tokens and cannot easily acquire enough to bond even one Darknode. We encourage those who are restricted by their jurisdiction to utilize the decentralized exchange infrastructure currently available. We try to practice what we preach, and utilizing DeFi is a great way to further propel the space into the mainstream. We can't legally recommend specific exchanges and we don't publicly discuss potential listings until they are public, but do know we are always working on bringing more democratic access to REN.
7) How does your company plan to make money in the future (to finance development once the money received in the ICO runs out)? Our team's incentives are directly aligned with those of our community (those who run Darknodes). The organization and its funding are centered around Darknode rewards: Darknodes earn fees for facilitating interoperability via RenVM, and this is how the organization will fund itself. 8) How is the audit going? Has any major issue been identified that could delay Mainnet SubZero? When is the audit estimated to complete? The audit is going well. The smart contracts have been finished, and all issues were addressed quickly. The Hyperdrive audit is currently underway, and there have been no critical issues reported so far. The next step is to scope the audit for the RZL sMPC paper and its accompanying implementation (the z0 engine). There are no timeline estimates we are comfortable giving the public at this stage, as audit times can vary a lot depending on what is found, and an audit of an sMPC implementation is not common (estimates quickly propagate through the community and become incorrectly interpreted as hard commitments). 9) How does your sMPC algorithm work? I can't find a description anywhere. Can the Darknodes perform arbitrary calculations over data split using SSS? How fast are those calculations? How is the new private key generated for the next era, so that the old nodes that generated the previous key do not have access to it? Also, what kind of help from external developers do you need right now? - It takes many pages of very formal discussion to describe how our sMPC algorithm works, but we are working on a paper that formally defines the algorithm and proves its properties.
This paper is being audited, and both the paper and the audit will be released to the public after the release of Mainnet. - Darknodes can, in theory, perform any computation over Shamir secret shared (SSS) data, but at the moment they are only configured to perform interoperability-related computation (key generation and key signing). - The performance of general computation over SSS data depends heavily on the kind of computation; however, sMPC is invariably orders of magnitude slower than the equivalent computation running on your local machine, because it involves network communication. - Every epoch a new key must be generated to store assets (and assets held by the old key must be transferred to the new key). The old group of Darknodes can generate the new key in such a way that the public key is known, but the private key shares are encrypted for the new group of Darknodes (this process does not reveal any of the new shares to any of the old Darknodes). Under the hood, this uses very simple homomorphic properties. Once the new key is generated, the old Darknodes simply run their usual sMPC to transfer all assets to the new key. - We would love it if the developer community started experimenting with our SDKs and contributed their thoughts and improvements to RenVM (and the dev infrastructure that supports it, e.g. RenJS & GatewayJS) via: https://github.com/renproject/ren/issues 10) What are the plans for the initial network bootstrapping to onboard Darknodes, to achieve sufficient decentralization and deliver on the security benefits? I understand the early stages of the network will have the core nodes of the Ren team and trusted partners responsible for maintaining the integrity of the network - do you intend to remain in this phase until sufficient transaction volume on the network attracts sufficient third-party operators? Are there plans to incentivize that initial volume?
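For readers unfamiliar with Shamir secret sharing, a toy sketch over a small prime field shows the split/reconstruct mechanics and the simple additive homomorphism mentioned above. This is purely illustrative: RenVM's RZL sMPC algorithm is far more involved and has not been published.

```python
import random

# Toy Shamir secret sharing: a secret is the constant term of a random
# degree-(t-1) polynomial; shares are points on it; any t points recover it.

PRIME = 2 ** 61 - 1  # field modulus (a Mersenne prime)

def split(secret, n, t):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = split(123456789, n=5, t=3)
assert reconstruct(shares[:3]) == 123456789

# "Very simple homomorphic properties": adding shares pointwise yields
# valid shares of the sum of the secrets, without revealing either secret.
a, b = split(100, 5, 3), split(23, 5, 3)
summed = [(x, (ya + yb) % PRIME) for (x, ya), (_, yb) in zip(a, b)]
assert reconstruct(summed[:3]) == 123
```

The additive property at the end is the kind of primitive that lets an old share group derive a key for a new group without any party seeing the underlying secret.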
We intend to remain in the Mainnet SubZero phase until there is sufficient volume (stable over a reasonable period of time) to attract members of the public to run Darknodes and earn rewards by doing so. During this period, Ren and other projects will operate Darknodes to keep the network running (and to keep it semi-decentralized amongst Ren and these third parties). It is important for the security of the network that volume grows naturally; otherwise it risks dropping after the incentivization ends. However, to begin with, we will support volume by providing liquidity to DEXs and keeping minting fees low. Thanks to everyone for contributing to our first AMA! We'll have more over the coming months, but as always, if you have any questions just come and ask in our Telegram: https://t.me/renproject
Establishing a commercial scenario for smart contracts: Chainlink, zk-SNARKs and sharding technology together deliver the ultimate killer app
This text was translated from Chinese; open the following link in Chrome and translate it to see all the images: https://bihu.com/article/1242138347 EDIT: found an English version with pictures: https://medium.com/@rogerfeng/making-smart-contracts-work-for-business-how-chainlink-zk-snarks-sharding-finally-delivered-8f268af75ca2 Author: Feng Jie. Translation: Liu Sha. "The highest state of technology is to integrate into the various scenes of everyday life, to shed its high-tech outerwear and become a part of everyday life." - Mark Weiser. People in the future will not even think of smart contracts as "innovative." By then, smart contracts will permeate every aspect of life, and people will not be able to imagine what the era before digital currency looked like. Later historians may divide human business history into two eras: the pre-smart-contract era and the post-smart-contract era. After all, digital money has brought unprecedented changes to the nature and patterns of business practice in the real world. An anonymous member of the Chainlink community once said: "Smart contracts can change the DNA of business." Of course, like all technological revolutions of the past, smart contracts also need to reach a "tipping point" to achieve truly large-scale application. So we need to ask ourselves two questions:
What exactly is this so-called tipping point?
As of August 2019, have we reached this tipping point?
To reach the tipping point means unlocking the ultimate nirvana of business.

Tipping point

We can think about this issue from the perspective of mainstream companies. Imagine what a perfect smart contract platform should look like. What characteristics should such a platform have? Which features are must-haves? To reach the tipping point, you must establish a public chain with the following four characteristics:
In addition to cryptocurrency, transactions can also be settled in mainstream fiat currency, complying with financial-industry regulatory standards such as ISO 20022.
Achieve scalability without sacrificing decentralization or security, that is, solving the "impossible triangle problem."
Connect to external, off-chain data, that is, solve the "oracle problem."
Protect the privacy of confidential transactions and contracts on a public chain.
Now that we have Chainlink, zk-SNARKs, and sharding technology, we have reached this tipping point. Next, let's explore how this ultimate nirvana is actually achieved. Our discussion will mainly take the perspective of Ethereum, which is still the top smart contract platform in community size and mainstream adoption. So what about private chains? Before delving in, I want to take a moment to address an unavoidable question. The mainstream view has long held that a private chain is the more suitable solution for the enterprise. So let us first dialectically analyze the two advantages and two major drawbacks of private chains.

Disadvantages
Centralization leads to relatively lower security
It's not surprising that the IBM-Maersk blockchain freight alliance has a hard time finding customers willing to join. How could other freight companies be willing to let their biggest competitor (Maersk) verify their trading data? Only madmen would dare.
Every company staking out its own territory (fragmentation):
This problem is even more serious than centralization. John Wolpert, co-founder of the IBM blockchain, wrote an excellent article called "Breaking the Barriers to Realize Security: Why Companies Should Embrace the Ethereum Public Chain," in which he covers this in detail. If every company builds its own private chain, it will lead to chaos, with every player defending its own territory. Today's B2B ecosystem is very complex. Imagine the innumerable private chains of the world intertwining to form a huge "spider web." This is neither cost-effective nor scalable. The starting point of the blockchain is to break down barriers, not to build more of them. "One day, one of your big buyers calls to ask if you want to join their private chain. You agree. The next day you get a call from a wholesaler asking the same question. Then come the supplier, the freight company, the insurance company, even the bank - and each company may have several private chains! Eventually you have to invest a lot of time and cost to operate dozens of blockchains every day. If yet another partner then asks you to join their private chain, you might say: 'Forget it, just fax me the order!'" - Paul Brody (Ernst & Young). "Every time you connect two private chains through a system integrator, you have to pay a lot of money."

Advantages
Scalability: with the Ethereum public chain implementing sharding, this advantage is rapidly shrinking.
Privacy protection: at this stage, the public chain / private chain classification is actually not very accurate. The AZTEC, Zether, and Nightfall protocols (all based on zk-SNARKs) effectively provide a "private chain mode" for the Ethereum public chain, allowing it to switch between public and private. Therefore, a more accurate classification would be consortium chains versus public chains.
By 2020, the public chain / private chain labels will gradually disappear. Public and private chains will no longer be two opposing concepts. Instead, we will speak of public versus private transactions and confidential versus open contracts, with the scope of these transactions and contracts varying according to specific needs: bilateral, multilateral, or even fully public. All in all, the private chain has two major drawbacks compared to the public chain, and its two major advantages are also rapidly disappearing. "Technology will evolve over time, so there will be a variety of solutions to existing problems. Ultimately, public-chain platforms will have the same performance, scalability, and data privacy as private chains, while at the same time ensuring security and decentralization."

Feature 1: Privacy protection (privacy for oracles and the public chain)

Enigma founder Guy Zyskind once joked in his MIT thesis that smart contracts can only become commercially valuable if they become "confidential contracts." He later proposed that zk-SNARKs and Trusted Execution Environments (TEEs) are the most promising solutions. He was not wrong. What are zk-SNARKs? A zk-SNARK is a zero-knowledge proof (ZKP) mechanism. So what is a zero-knowledge proof? In short: a zero-knowledge proof lets you prove that you possess certain information without revealing its content. Vitalik Buterin explained this concept in detail from a technical point of view in an article published in 2017. Hackernoon also wrote an excellent article explaining the concept in an easy-to-understand way, using the example of a five-year-old child and Halloween candy. What is a trusted execution environment?
A trusted execution environment lets code run on closed hardware with two guarantees: 1) the result cannot be tampered with, and 2) absolute privacy is protected - even the operator of the hardware running the code cannot extract confidential information. The best-known trusted execution environment is Intel SGX, and Chainlink established a relationship with Intel SGX after acquiring Town Crier. Ernst & Young released the Nightfall protocol on GitHub on May 31, 2019. That a well-known accounting firm with a century of history chose to add privacy features to the public chain rather than develop a private chain says a great deal. Since then, the community has been actively building on this foundation, not only improving the code but also developing a plug-and-play Truffle Box for those who are not comfortable writing code. Blockchain communities and businesses rarely collaborate this closely, so these collaborations fully demonstrate Nightfall's popularity. Prior to this, two zk-SNARK-based Ethereum public chain privacy protocols were introduced: AZTEC (ConsenSys) and Zether (Stanford, JPMorgan Chase). An obvious trend is slowly taking shape. In the oracle field, Chainlink uses both zero-knowledge proofs and trusted execution environments so that they complement each other. Trusted execution environments guarantee data privacy, even for data the nodes themselves cannot access (a feature that is critical for bank accounts and API keys). Chainlink is still working on its trusted execution environment implementation, and nodes can access data temporarily, so authentication services are also needed. Although a trusted execution environment is almost 100% foolproof, in theory even the strongest shield can be pierced. Therefore, the team is currently trying to run zk-SNARKs inside a trusted execution environment (Thomas Hodges mentioned this in the 2019 TruffleCon Q&A session). The combination of the two forms a very robust and complete system.
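To make the zero-knowledge idea concrete, here is a minimal non-interactive Schnorr proof of knowledge of a discrete logarithm, made non-interactive via the Fiat-Shamir heuristic. The toy modulus and base are illustrative only, and zk-SNARKs are a far more general and complex construction than this classic protocol:

```python
import hashlib
import secrets

# Prove you know x with y = G**x mod P without revealing x.
# Toy parameters for illustration; real systems use vetted groups.

P = 2 ** 127 - 1   # a Mersenne prime modulus (NOT a secure, vetted group)
G = 3              # base element
Q = P - 1          # exponent modulus, since G**(P-1) = 1 (mod P)

def challenge(r, y):
    h = hashlib.sha256(f"{r}:{y}".encode()).digest()
    return int.from_bytes(h, "big") % Q

def prove(x):
    y = pow(G, x, P)                     # public value
    k = secrets.randbelow(Q)             # one-time random nonce
    r = pow(G, k, P)                     # commitment
    s = (k + challenge(r, y) * x) % Q    # response: x stays masked by k
    return y, r, s

def verify(y, r, s):
    # Check G**s == r * y**c (mod P); holds iff the prover knew x
    return pow(G, s, P) == r * pow(y, challenge(r, y), P) % P

y, r, s = prove(x=secrets.randbelow(Q))
assert verify(y, r, s)                   # verifier learns nothing about x
assert not verify(y, r, (s + 1) % Q)     # tampered proofs fail
```

The verifier checks one modular equation and never sees `x`; that "prove possession without revealing content" property is exactly what zk-SNARKs generalize to arbitrary computations.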
An attacker would have to strip away all the layers of the onion simultaneously to mount any effective attack (and peeling even one layer is already difficult). "Chainlink combines trusted execution environments with zero-knowledge proofs to build what we call a defense-in-depth system. They provide all the tools smart contract developers need - trusted execution environments, multiple nodes and data sources, margins, reputation systems, asymmetric encryption, zero-knowledge proofs, WASM, and OTP+RNG - and these features let smart contract developers tune the confidentiality and cost of contracts to their specific budget and security needs." (from "Oracle Machines: Chainlink and Its Four Major Application Scenarios") In the future, zk-SNARKs may be upgraded to zk-STARKs (a fully transparent zero-knowledge proof mechanism), which would protect the system from quantum-computer attacks. The best thing about zk-STARKs is that they are more scalable than zk-SNARKs: they protect privacy better without increasing gas costs. If you want to learn more about zk-STARKs, you can read a popular-science article by Adam Luciano.

Feature 2: Scalability (scalability of oracles and the public chain)

To understand this problem, we can draw an analogy: a public chain is like a large enterprise in which every employee (i.e. every node) must attend every meeting (i.e. confirm every transaction). Imagine how inefficient this company is! Only customers with a lot of money (i.e. gas fees) can get their requests to the front of the queue. And this is not even the most serious problem. The most serious problem is that the more employees (nodes) join the company, the harder it is for the company to function properly! In the end, the company not only fails to expand linearly but actually gets slower and slower.
Although this guarantees decentralization and security to the greatest extent, the price is abandoning scalability entirely. There are various stopgap solutions, but none completely solves the "impossible triangle problem." For example, EOS uses DPoS (Delegated Proof of Stake), in which only 21 super nodes (many of them well-known entities) are responsible for verifying all transactions. Off-chain channels and sidechains (such as Bitcoin's Lightning Network and Ethereum's Raiden Network) buy scalability and decentralization at the expense of security. So how does sharding solve this problem? Let's make another analogy: in reality, the only kind of company where it is reasonable to ask everyone to attend every meeting is a small start-up (i.e. a private chain that limits the number of nodes). In most cases, large companies divide employees into thousands of teams (i.e. shards), and each team's lead (i.e. a validator) reports to senior management (i.e. the main chain). If people from different teams need to collaborate (and sometimes they do), they can do so through cross-shard receipts. If new employees join the company, the teams can be re-divided (i.e. re-sharding). This allows linear expansion. In fact, the process of growing a start-up into a large enterprise is surprisingly similar to the process of Ethereum 1.0 developing into Ethereum 2.0. "In the Ethereum 1.0 period a few lone builders were trying to build a world computer; Ethereum 2.0 will actually develop into a world computer," Vitalik Buterin said at the first workshop. Since Ethereum was not originally built on the sharding principle, it takes seven steps to reach the goal (a bit like a word-ladder game). The first step is planned for January 3, 2020.
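The team/shard analogy can be sketched in a few lines: accounts hash deterministically to shards, and a cross-shard transfer emits a receipt for the destination shard to consume. The 64-shard count and all names here are hypothetical illustrations, not Ethereum 2.0's actual design:

```python
import hashlib

# Toy sharded-ledger routing: hash-based shard assignment plus
# cross-shard receipts, mirroring the teams-and-reports analogy.

NUM_SHARDS = 64

def shard_of(address):
    """Hash-based assignment: every node agrees without coordination."""
    digest = hashlib.sha256(address.encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_SHARDS

def transfer(sender, receiver, amount):
    src, dst = shard_of(sender), shard_of(receiver)
    if src == dst:
        # same team: settle within one shard, no receipt needed
        return {"type": "intra-shard", "shard": src, "amount": amount}
    # different teams: debit on src, credit later via a cross-shard receipt
    return {"type": "cross-shard",
            "receipt": {"from_shard": src, "to_shard": dst,
                        "receiver": receiver, "amount": amount}}
```

Because assignment is a pure function of the address, a shard only ever validates its own slice of transactions, which is where the linear scaling in the analogy comes from.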
At the same time, developers can use many other blockchain platforms designed around sharding from the start. Some of these platforms, including Zilliqa and QuarkChain, are already compatible with Chainlink. If you want a more in-depth technical analysis of sharding, check out an article by Ramy Zhang. In the oracle field, Chainlink has the following two characteristics: 1) It uses Schnorr threshold signatures to reach consensus quickly and cost-effectively; the next version of the on-chain component will need only 16,000 gas. 2) We have previously discussed the need for trusted execution environment hardware to ensure that nodes cannot access sensitive data. Since you have the hardware anyway, you can use it to do real computing work and thereby offload computation from the smart contract platform. "With the SGX system (Town Crier) and zero-knowledge proof technology, oracles can be truly reliable and confidential, so the boundary between the oracle and the smart contract begins to blur... Our long-term strategy... is to let the oracle become the key source of computing resources used by most smart contracts. We believe the way to achieve this is to perform off-chain computation in the oracle to meet various computing needs, and then send the results to the smart contract." - Chainlink white paper, section 6.3 (page 26). Of course, this "long-term strategy" carries risks, unless Chainlink can implement a trusted execution environment and its service-provider ecosystem achieves a qualitative leap. Still, the Chainlink team's vision is absolutely forward-looking: off-chain computation is a key factor in ensuring that blockchains are not dragged down by huge volumes of IoT data. The Internet of Things is dramatically escalating the scale of big data.
At present, most data is still generated on the software side and is not real-time; in the future, most data will be real-time data generated on the sensor side. One big drawback of real-time data is that it increases storage pressure. For example, Coughlin Associates expects a driverless car to generate 1 GB of data per second, meaning a single car will produce 3.6 TB of data per hour! The only viable solution is to analyze the data in real time rather than store it first. In its Global Cloud Index: 2016-2021 Forecast and Methodology white paper, Cisco predicts that more than 90% of data in 2021 will be analyzed in real time and never stored. That is to say, the essence of such data is that it exists only for an instant, while the essence of a blockchain is immutability, so the two mix like water and oil. The solution is to analyze the raw data off-chain, extract the meaningful results, and send those to the blockchain. The combination of sharding and trusted execution environments forms a new computing architecture, similar to the cloud-fog-edge computing architecture. It should be noted that improving computing power is good, but it is not the main purpose of the blockchain. The fundamental purpose of the blockchain is not to reduce the raw cost of computing and data storage; after all, technology giants such as Amazon, Microsoft, Google, Salesforce, Tencent, Alibaba, and Dropbox have built world-class cloud services. Centralized servers win on computational efficiency (though blockchains will greatly improve computational efficiency through sharding and may one day catch up). The value of the blockchain is to reduce the cost of building trust. Nick Szabo calls this "social scalability" (a concept relative to the "operational" scalability we have been discussing).
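The sensor-data figures quoted above are easy to sanity-check:

```python
# Checking the driverless-car data rate: 1 GB per second of driving.
gb_per_second = 1
seconds_per_hour = 60 * 60
gb_per_hour = gb_per_second * seconds_per_hour   # 3600 GB per hour
tb_per_hour = gb_per_hour / 1000                 # 3.6 TB, matching the text
```

At that rate, storing even a single day of raw driving data on-chain is out of the question, which is the argument for off-chain analysis feeding only extracted results to the blockchain.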
Vitalik Buterin has also made it clear that the point of smart contracts is to accept a small penalty in computational latency in exchange for a substantial reduction in "social costs." Alex Coventry of the Chainlink team once asked: "How many opportunities for cooperation and reciprocity have we missed because we cannot confirm whether the other party will keep its promises?" Is there potential in data-storage projects like Siacoin and IPFS? What about decentralized computing projects like SONM and Golem? Siacoin's core value proposition is not that its computing efficiency is higher than traditional cloud services: splitting, replicating, and reassembling data has a computational cost, and companies are better placed than individuals to buy the latest and greatest hardware. Siacoin's core value proposition is to handle data in an Airbnb-like model, so management overhead is lower than in traditional models. It also generates additional social value, such as disaster tolerance, privacy and security, and censorship resistance. The same is true of Golem and SONM. Even with the most efficient protocol, some latency penalty is unavoidable when coordinating hardware across different geographic locations. Therefore, all else being equal, centralized hardware still has the advantage of faster computation; the core value proposition of the above projects is instead to use the Airbnb-like model to reduce management costs. We must strictly distinguish "social scalability" from "operational scalability"; the two cannot be conflated. I will explain both concepts in detail when I discuss the "magic bus and lightweight library" later. Feature 3: Compatibility with fiat currency. Most mainstream companies do not regard cryptocurrencies as "real money."
Moreover, even for those who want to transact in cryptocurrency, high price volatility makes it very difficult in practice. I discussed the "price volatility problem" in detail in Chapters 8 and 9 of the previous article. These problems do not erase the value of cryptocurrencies, which have many advantages fiat currency lacks; I am simply emphasizing that we need to understand the comfort zone of mainstream companies. Chainlink acts as a universal API connector that can trigger open-banking payments. Chainlink is fully compliant with ISO 20022 and has a long-standing relationship with SWIFT (it is worth mentioning that the SWIFT work has not been updated for some time; an update is hoped for after the SIBOS 2019 conference). PSD2 takes effect on September 14, 2019, after which all banks in the EU must comply with the new regulation. In other words, banks must expose all account data at the "front end," callable through APIs. An approved third party (e.g., a Chainlink node) can then trigger payments directly, without a payment service provider. Although the United States and Japan have not adopted similar laws, many banks are spontaneously promoting open banking anyway: opening APIs to third-party developers creates new revenue streams and customer experiences that ultimately increase profitability, and it helps banks respond to competitors in mobile payments and fintech in an app-centric economy. As this open-banking revolution continues, Chainlink will connect smart contracts with the world's major currencies (US dollar, euro, yen, etc.). Only one external adapter is required to connect to an authenticated API. From a programming perspective this is relatively simple, allowing everyone in the community to contribute code to the code base (and thus achieve scalability).
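To give a concrete sense of how small such an external adapter can be, here is a minimal Python sketch. The request/response field names (`id`, `data`, `jobRunID`, `result`, `statusCode`) follow the commonly documented external-adapter JSON convention; `fetch_payment_status` and the `payment_id` field are hypothetical stand-ins for a real authenticated banking-API call.

```python
import json

def fetch_payment_status(payment_id):
    # Hypothetical placeholder for an authenticated call to an
    # open-banking API; a real adapter would perform an HTTP request here.
    return "completed"

def adapter_response(request_body: bytes) -> dict:
    """Build an external-adapter style response.

    The adapter receives a JSON body with an `id` (job run ID) and a
    `data` payload, calls the external API, and returns the result in
    the fields a Chainlink node expects (`jobRunID`, `data`, `result`).
    """
    req = json.loads(request_body)
    payment_id = req.get("data", {}).get("payment_id")
    status = fetch_payment_status(payment_id)  # hypothetical API call
    return {
        "jobRunID": req.get("id"),
        "data": {"payment_id": payment_id, "status": status},
        "result": status,
        "statusCode": 200,
    }
```

Wrapped in any small HTTP server, a handler like this is the entire "bridge" between a node and an off-chain API, which is why community-contributed adapters scale so easily.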
Chainlink has released adapters for PayPal and Mister Tango (a European analogue of PayPal). Feature 4: Connecting data to the chain. Chainlink has long worked on the "oracle problem" and successfully launched on mainnet on May 30, 2019. It has achieved a great deal in just a few months. Provable (formerly Oraclize) was successfully run on a Chainlink node, effectively settling the debate about whether oracles should be centralized or decentralized. Synthetix lost 37 million sETH in an oracle exploit because it was not yet connected to Chainlink; fortunately, the funds were ultimately recovered and no lasting damage was done. The lesson illustrates the importance of decentralized oracles. In addition, both Oracle and Google have partnered with Chainlink to monetize their API data, creating a virtuous circle to capture the market opportunities Facebook missed. New nodes come online every week, and network activity has been very high. The Chainlink team maintains a list of certified nodes in its documentation and Twitter releases. Twitter user CryptoSponge has also set up a Tableau dashboard tracking Chainlink team development updates. On the importance of the current stage in blockchain history, Brad Huston summed it up brilliantly: "The biggest problem with cryptocurrencies is building bridges between cryptocurrencies, fiat currencies and big data. Chainlink is very elegantly narrowing the distance between the three. Now it can even be said: 'The bridge has been built.'" Magic bus and lightweight library. Let's summarize what we discussed earlier. The real purpose of the blockchain is to reduce the cost of building trust and achieve "social scalability." By this logic, the main application scenarios of platforms such as Ethereum 2.0 and Zilliqa should be in the B2B field.
To quote something I wrote in a previous article: "My conclusion is: if the smart contract succeeds, it will succeed first in the B2B field. The private chain is self-contradictory and destined to fail. It leads to a land-grab of isolated hilltops, raising social costs, which is the very opposite of B2B, and so it ultimately defeats itself." Before sharding technology emerged, even simple games (e.g., CryptoKitties) could not run smoothly on a public chain, let alone complex B2B contracts capable of changing commercial DNA. With sharding, everything is ready. Even so, we should not treat Ethereum 2.0 as an all-powerful platform. As noted above, speeding up computation is welcome but is not Ethereum 2.0's real purpose, and the blockchain's immutability makes it ill-suited to the huge volume of fleeting real-time IoT data. In other words, we must be soberly aware that Ethereum 2.0 will not replace traditional Web 2.0. Instead, we should make better use of Ethereum 2.0's real advantages: "There is a new concept now: think of the Ethereum mainnet as a global bus... With the Ethereum 2.0 mainnet we can treat various business resources like a working group on Slack: easily assembled, integrated, and restructured. The SAP inventory management system in your company, the dealer's JD Edwards ERP system, and the financial-technology partner's shiny blockchain system can interface seamlessly, eliminating the need to develop dedicated infrastructure for each partner." (John Wolper describing his ideal "magic bus") Ethereum 2.0 should be an integration hub, not a data center or a computing center. It should be a library built specifically to store B2B contract terms (frankly, even with sharding, that alone is a large enough volume of data).
We should not expect Ethereum 2.0 to be an all-powerful platform, but rather develop it into a "lightweight library." If we reorder the pyramid model above, the architecture of the magic bus becomes obvious: Of course, the positional relationships in the model are not static. With the development of 5G, edge computing, and IoT sensors, devices may bypass the cloud to interact directly (or even bypass the fog layer). If the collaboration between IoTeX and Chainlink succeeds, the edge will be able to interact directly with trusted execution environments. Time will tell whether the Airbnb-like model of shared data storage and computing can bring management costs below those of today's mainstream Web 2.0 model. Time will also tell whether the market really wants censorship resistance, tamper resistance, security, and privacy protection. Do users really care about these social values, and are they willing to pay for them? Do they see them as icing on the cake, or as the most fundamental value? Conclusion: Whether it is the battle between Web 2.0 and Web 3.0 or the battle between cryptocurrency and fiat currency, one thing is beyond doubt: we have reached the tipping point, and the era of commercially valuable smart contracts has arrived. The only real question now is timing, and the main roadblocks have basically been cleared.
When will Ethereum 2.0 finish these 7 stages and be officially released?
When will Chainlink use a trusted execution environment on a large scale? If the cooperation between Intel SGX and Town Crier fails, what alternative plans are there? Will Chainlink communicate with other blockchain teams that plan to use a trusted execution environment (such as Dawn Song's Oasis Labs)?
At present, the main technical problems in the ecosystem have been solved; what remains is to recruit a group of enthusiastic developers to do the "wiring" work. Digital currency has changed commercial DNA, and the future is full of possibilities. The only thing that limits us now is our own imagination. The future is infinitely imaginative, and it will belong to developers. Dapps are already unstoppable. There is no doubt that we have found the ultimate nirvana. This text was translated from Chinese; open the following link in Chrome and translate to see all images: https://bihu.com/article/1242138347
This report was produced by the Huobi Blockchain Research Institute; authors: Yuan Yuming, Hu Zhiwei. For the PDF version, please see the original post. Summary: The Huobi Blockchain Application Research Institute studied distributed ledger technology based on the directed acyclic graph (DAG) data structure from a technical perspective and, through concrete technical tests of the representative project IOTA, reached the following main results:
Compared with blockchain technology in the narrow sense, DAG technology has genuine innovations. In theory it can achieve both high scalability and high decentralization, but only under the assumption that no participants act maliciously; accordingly, it carries some security risks.
In practice, DAG projects tend to adopt centralized operation, trading away some decentralization to improve security; at the same time, their transaction rates come nowhere near the theoretical limit.
Using a 40-node network based on VPS instances (with PoW on the CPU) to stress-test the representative project IOTA, we found the TPS to be very low: the best result achieved was 4.19.
Analysis shows that the performance bottleneck lies mainly in the client hardware performing the PoW. Using other approaches such as FPGAs should further unlock the performance potential of DAG.
At the same time, in order to maintain good transaction processing capabilities, a network of nodes of sufficient scale should be established.
Report body 1. Introduction Blockchain is a distributed ledger technology, but distributed ledger technology is not limited to "blockchain." In the wave of digital economic development, more distributed ledger technologies are being explored and applied in order to improve on the original technology and serve more practical business scenarios. The Directed Acyclic Graph (hereinafter "DAG") is one representative. What is DAG technology, and what is the design thinking behind it? How does it perform in practice? We attempted to reach analytical conclusions through a deep analysis of DAG technology and actual test runs of the representative project IOTA. It should also be noted that the indicator data obtained from these tests are not, and should not be taken as, proof or confirmation of the final effectiveness of the IOTA platform or project. 2. Main conclusions After research and test analysis, we reached the following main conclusions and technical recommendations:
Compared with blockchain technology in the narrow sense, DAG technology has genuine innovations. In theory it can achieve both high scalability and high decentralization, but only under the assumption that no participants act maliciously; accordingly, it carries some security risks.
In practice, DAG projects tend to adopt centralized operation, trading away some decentralization to improve security; at the same time, their transaction rates come nowhere near the theoretical limit.
Using a 40-node network based on VPS instances (with PoW on the CPU) to stress-test the representative project IOTA, we found the TPS to be very low: the best result achieved was 4.19.
Analysis shows that the performance bottleneck lies mainly in the client hardware performing the PoW. Using other approaches such as FPGAs should further unlock the performance potential of DAG.
At the same time, in order to maintain good transaction processing capabilities, a network of nodes of sufficient scale should be established.
3. DAG Introduction 3.1. The DAG principle A DAG (Directed Acyclic Graph) is a directed graph data structure in which, starting from any vertex and following the directed edges, it is impossible to return to that same vertex (no cycles), as shown in the figure below: 📷 Since distributed ledgers based on the DAG data structure (hereinafter "DAG") were proposed in recent years, many have considered them a promising replacement for blockchain technology in the narrow sense, because DAG designs aim to preserve the advantages of the blockchain while improving on its shortcomings. Unlike the traditional linear blockchain structure, the transaction records of a DAG-based distributed ledger platform such as IOTA form a directed acyclic graph, as shown in the following figure: 📷 3.2. DAG characteristics Owing to this different data structure, DAG-based distributed ledger technology offers high scalability and high concurrency and is well suited to IoT scenarios. 3.2.1. High scalability, high concurrency The data synchronization mechanism of traditional linear blockchains (such as Ethereum) is synchronous, which can cause network congestion. A DAG network adopts an asynchronous communication mechanism that allows concurrent writes: multiple nodes can record transactions at different tempos simultaneously, with no strict global ordering. The network's data may therefore be temporarily inconsistent, but it eventually synchronizes.
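The acyclic, concurrently growing structure described above can be sketched in a few lines of Python. This is a toy model for intuition only, not IOTA's actual implementation: the `Tangle` class and its rule that each new transaction approves up to two unapproved "tips" are illustrative simplifications (real systems add signatures, PoW, and more sophisticated tip selection). Cycles are impossible by construction, because every new edge points from the new transaction back to transactions that already exist.

```python
import random

class Tangle:
    """Toy DAG ledger: each new transaction approves up to two existing
    unapproved transactions ("tips"), so edges always point backwards
    in time and no cycle can ever form."""

    def __init__(self):
        self.approvals = {"genesis": []}  # tx -> list of txs it approves
        self.approved = set()             # txs approved by someone

    def tips(self):
        # Tips are transactions nobody has approved yet.
        return [tx for tx in self.approvals if tx not in self.approved] or ["genesis"]

    def attach(self, tx_id):
        """Attach a new transaction, approving up to two random tips."""
        tips = self.tips()
        chosen = random.sample(tips, k=min(2, len(tips)))
        self.approvals[tx_id] = chosen
        self.approved.update(chosen)
        return chosen
```

Because `attach` never blocks on a global ordering, many nodes could run it concurrently on their local views and reconcile later, which is the asynchronous-write property the section describes.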
3.2.2. Applicable to IoT scenarios
In a traditional blockchain network, each block contains many transactions involving multiple users, packaged and broadcast by miners. In a DAG network there is no concept of a "block"; the smallest unit of the network is a "transaction," and each new transaction must verify two earlier transactions. A DAG network therefore needs no miners to relay trust, and transfers require no fee, which makes DAG technology well suited to micropayments. 4. Analysis of the technical approach A trilemma means that in a given situation only two of three advantageous options can be chosen, or one of three adverse options must be accepted. Such dilemmas arise in fields as varied as religion, law, philosophy, economics, and business management, and blockchain is no exception. The blockchain's impossible triangle is: of Scalability, Decentralization, and Security, only two can be chosen. Analyzing DAG technology in these terms, and given the introduction above, DAG clearly occupies the decentralization and scalability corners. These two can be seen as two sides of the same coin: the asynchronous accounting enabled by the DAG data structure simultaneously delivers highly decentralized participation by network nodes and transaction scalability. 5. Problems Since the data structure delivers decentralization and scalability at the same time, the impossible-triangle theory suggests that security is the hidden weakness. But DAG is a relatively novel and special structure; can it achieve security more perfectly? The actual results suggest not. 5.1. The double-spend problem DAG's asynchronous communication makes double-spend attacks possible.
For example, an attacker adds two conflicting transactions (a double spend) at two different places in the network. The transactions are referenced forward repeatedly until they both appear on the verification path of the same transaction, at which point the network discovers the conflict. The common ancestor at which the two paths converge can then determine which transaction is the double-spend attack. If the verification paths are too short, a "blowball" problem can arise: in the extreme case where most transactions are "lazy" and verify only early transactions, the transaction network forms a topology centered on a minority of early transactions. This is bad for a DAG, which relies on an ever-growing stream of transactions to increase network reliability. For now, the double-spend problem must be addressed with designs that take the actual situation into account, and different DAG networks have their own solutions. 5.2. The shadow-chain problem Because of the latent double-spend problem, an attacker who can construct enough transactions may fork a fraudulent branch (a shadow chain) from the real network data, include a double-spend transaction in it, and then merge the branch back into the DAG network; in this case, the branch can possibly replace the original transaction data. 6. Current improvement schemes At present, projects mainly guarantee security by sacrificing some of the DAG's native characteristics. The IOTA project uses a Markov chain Monte Carlo (MCMC) approach to address this problem. IOTA introduces the concept of cumulative weight for transactions, recording how many times a transaction has been referenced, to indicate its importance.
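The cumulative-weight idea behind MCMC tip selection can be sketched as a weighted random walk. This is not IRI's actual algorithm: the exponential weighting `exp(alpha * weight)` and the `approvers`/`weight` inputs are simplifying assumptions chosen to mirror the published description, in which heavily referenced paths are more likely to be selected.

```python
import math
import random

def mcmc_walk(approvers, weight, start, alpha=0.5):
    """Walk from `start` toward the tips: at each step, move to one of
    the transactions that approve the current one, biased toward higher
    cumulative weight. `approvers` maps tx -> list of approving txs,
    `weight` maps tx -> cumulative weight, and `alpha` trades off pure
    randomness (alpha = 0) against weight bias (large alpha)."""
    current = start
    while approvers.get(current):
        candidates = approvers[current]
        probs = [math.exp(alpha * weight[c]) for c in candidates]
        current = random.choices(candidates, weights=probs)[0]
    return current  # a tip: nothing approves it yet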
The MCMC algorithm selects existing transactions in the current network as references for newly added transactions via a random walk weighted by cumulative weight: the more a transaction's path has been referenced, the more likely the algorithm is to select it. The walk strategy was further optimized in version 1.5.0 to keep the "width" of the transaction topology within a reasonable range, making the network more secure. However, at platform launch, with few participating nodes and transactions, it is hard to prevent a malicious organization from sending a large number of malicious transactions through many nodes and subjecting the whole network to a shadow-chain attack. An authoritative arbiter is therefore needed to determine the validity of transactions. In IOTA this node is the Coordinator, which periodically snapshots the current transaction network (the Tangle); transactions contained in a snapshot are confirmed as valid. But the Coordinator will not exist forever: as the network runs and grows, IOTA plans to remove the Coordinator at some point in the future. Byteball's improvement scheme is characterized by its design of witnesses and a main chain. Because the DAG structure yields many transactions with only a partial order, avoiding double spends requires establishing a total order over these transactions, forming a transaction main chain; of two conflicting transactions, the one earlier on the main chain is considered valid. Witnesses, roles held by well-known users or institutions, form the main chain by continually sending transactions that confirm other users' transactions. Such schemes also change the platforms built on the DAG structure in different ways: in IOTA's case, the introduction of the Coordinator reduces decentralization to some extent. 7. Actual operation 7.1.
Positive effects Besides addressing security, the above schemes can also partially solve the smart contract problem. The native features of DAG bring two potential problems: (1) transaction confirmation time is uncontrollable; the current retransmission mechanism requires complicated timeout design on the client side, whereas a simple one-shot confirmation mechanism would be preferable. (2) There is no global ordering mechanism, which limits the types of operations the system can support. It is therefore difficult to implement a Turing-complete smart contract system on a DAG-based distributed ledger platform. To ensure smart contracts can run, some entity must do the work described above; the current Coordinator or a main chain can achieve a similar result. 7.2. Negative effects As one of the most intuitive indicators, DAG's TPS should in theory be unlimited. If the maximum TPS of the IOTA platform is the capacity of a factory, then day-to-day TPS is the factory's daily output. For maximum TPS, the April 2017 IOTA stress test showed the network handling 112 CTPS and 895 TPS; this was the result of a small test network of 250 nodes. For day-to-day TPS, the publicly available data show the mainnet recently averaging about 8.2 TPS, with a CTPS (confirmed transactions per second) of about 2.7. 📷 The test network averages about 4 TPS and about 3 CTPS. 📷 Data source: Discord bot generic-iota-bot#5760. Is this related to the existence of the Coordinator? Actual testing is needed to find out. 8.
Measured analysis The operational statistics of the open test network depend on many factors. For further analysis, we continue to use the IOTA platform as our example and build a private test environment for technical measurement. 8.1. Test architecture The relationships among the components built for this test are shown below. 📷 Among them:
To avoid other factors on the mainnet or testnet affecting the results, we built a small 40-node private IOTA test network. The build process follows the "Private IOTA Testnet" project (https://github.com/schierlm/private-iota-testnet.git), with the IRI (IOTA Reference Implementation) used in it updated to the latest v1.5.0 release. In addition, during the test, the Coordinator simulation tool provided by that project is used to generate snapshots periodically and confirm transactions.
We use the Python-based open-source load-testing tool Locust (https://locust.io/) to control the sending of transaction data. Once the test starts, transaction data is sent to randomly chosen nodes across the private network.
When each node is started, the "--zmq-enabled" flag is added to enable the IRI's ZeroMQ support, so that transaction data received by the node is published to the observer through a message queue. On the observer side, a ZeroMQ event-parsing script (https://github.com/lunfardo314/tps_zmq.git) is used to observe node transaction data.
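For reference, the load-generation side of such a setup might look like the sketch below. The node addresses are hypothetical placeholders, and the actual test used Locust's own task scheduling rather than raw threads; only the general shape of an IRI API call (a POST to the node's single endpoint whose JSON body carries a `command` field, e.g. `getNodeInfo`, with the `X-IOTA-API-Version` header) follows the public IRI HTTP API.

```python
import json
import random
import threading
import urllib.request

# Hypothetical addresses for the 40-node private network described above.
NODES = [f"http://10.0.0.{i}:14265" for i in range(1, 41)]

def api_body(command, **params):
    """Build the JSON body of an IRI API request."""
    return json.dumps({"command": command, **params}).encode()

def send_command(node, command, **params):
    """POST one command (e.g. getNodeInfo) to a node and decode the reply."""
    req = urllib.request.Request(
        node,
        data=api_body(command, **params),
        headers={"Content-Type": "application/json",
                 "X-IOTA-API-Version": "1"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def spray(n_requests):
    """Fire n_requests commands at randomly chosen nodes in parallel,
    mimicking the randomized load pattern of the Locust test."""
    threads = [threading.Thread(target=send_command,
                                args=(random.choice(NODES), "getNodeInfo"))
               for _ in range(n_requests)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

Randomizing the target node per request matters here: as the later IOTA discussion notes, piling all load onto one node measures that node's bandwidth rather than the network's throughput.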
8.2. Test hardware environment The servers are Amazon AWS EC2 c5.4xlarge instances: 16 cores at 3 GHz (Intel Xeon Platinum 8124M), 32 GB of memory, a 10 Gbps LAN between servers with communication latency (ping) under 1 ms, running Ubuntu 16.04. 8.3. Test scenarios and results analysis
8.3.1. Default PoW Difficulty Value
Although there is no concept of "miners," an IOTA node must still perform a proof of work before sending a transaction, to prevent flooding the network with large numbers of transactions. The Minimum Weight Magnitude (MWM) plays a role similar to Bitcoin's difficulty: the PoW result must end in a given number of "9" trytes, where "9" corresponds to "000" in the ternary encoding IOTA uses. The difficulty can be set before a node starts; currently, the production network uses an MWM of 14 and the test network uses 9. We therefore first test with the testnet default difficulty of 9, obtaining the following results. 📷 Since each IOTA bundle contains multiple transfers, the TPS actually processed is higher than the send rate. But the zmq-parsing script shows that the observed TPS is very low, and the number of requests successfully sent per second is also low. Analysis indicates that, because the test uses VPS instances whose CPUs must perform the PoW computation, the transaction rate is limited mainly by PoW speed.
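To make the mechanism concrete, here is a toy Python sketch of the minimum-weight-magnitude idea. It uses SHA-256 as a stand-in for IOTA's Curl hash and maps bytes to trits via `byte % 3`; both are simplifications. Only the structure mirrors the real scheme: search for a nonce until the hash ends in `mwm` zero trits, so expected work grows roughly as 3**mwm, which is why lowering the difficulty raises the achievable send rate.

```python
import hashlib

def trailing_zero_trits(digest: bytes) -> int:
    """Count trailing zero trits, mapping each byte to a trit (byte % 3).
    A toy stand-in for checking that a Curl hash ends in zero trits."""
    trits = [b % 3 for b in digest]
    count = 0
    for t in reversed(trits):
        if t != 0:
            break
        count += 1
    return count

def do_pow(tx_bytes: bytes, mwm: int) -> int:
    """Find a nonce such that the (stand-in) hash of tx_bytes + nonce
    ends in at least mwm zero trits: the Minimum Weight Magnitude check."""
    nonce = 0
    while True:
        digest = hashlib.sha256(tx_bytes + nonce.to_bytes(8, "big")).digest()
        if trailing_zero_trits(digest) >= mwm:
            return nonce
        nonce += 1
```

Because this inner loop is pure hashing, it parallelizes well, which is why FPGA implementations of Curl are an obvious avenue for lifting the client-side bottleneck observed above.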
8.3.2. Decrease the PoW difficulty value
Setting the difficulty to 1 and re-testing yields the following results. 📷 As the results show, TPS rises after the difficulty is reduced. The current TPS of the IOTA project therefore does not hit a bottleneck at the Coordinator; it is constrained mainly by the hardware and network of the client sending the transactions. The IOTA community is currently working on an FPGA-based implementation of the Curl algorithm and on CPU instruction-set optimization. Our test results confirm that these are promising directions for further exploring the performance potential of the DAG platform.
In the past weeks I heard a lot of pros and cons about IOTA, many of which I believe are not true (I'll explain better). I would like to start a serious discussion about IOTA and help people get into it. Before that, I'll contribute what I know; most things I say will come with a source link providing some supporting content.
The pros and cons I heard most often are listed below; I'll discuss the items marked with *. Pros
Many users claim that the network scales infinitely: that the more transactions there are, the faster it gets. This is not entirely true, which is why we are seeing the network congested (pending transactions) at the moment (12/2017). The network is composed of full-nodes (which store all transactions); each full-node is capable of sending transactions directly to the tangle. An arbitrary user can run a light-node (which does not store all transactions, hence a smaller footprint), but since it does not store all transactions it cannot decide whether transactions conflict (among other things), so it must connect to a full-node (the Bitfinex node, for example) and ask that full-node to send a transaction to the tangle. The full-node acts as a bridge for light-node users, and the number of simultaneous transactions a full-node can push to the tangle is limited by its bandwidth. What happens at the moment is that there are few full-nodes, but more important than that: the majority of users are connected to basically the same full-node, which cannot handle all the transactions requested by the light-nodes because of its bandwidth. If you are a light-node user experiencing slow transactions, you need to manually select another node to get better performance. Also, you should verify that the minimum weight magnitude (the difficulty of the Hashcash proof of work) is set to at least 14. The network seems to be fine and it scales, but the steps a user has to take/know are not user-friendly at all. It's necessary to understand that the technology involved is relatively new and still in early development. Do not buy IOTA if you haven't read about the technology; there is a high chance of losing your tokens for various reasons, and it will be your own fault. You can learn more about how IOTA works here.
There are some upcoming solutions that will bring the user experience to a new level: the UCL Wallet (expected to be released this month; I will talk soon about how it will help the network) and the Nelson CarrIOTA (this week), besides the official implementations to come in December.
We all know that currently (2017) IOTA depends on the Coordinator because the network is still in its infancy, and because of that the majority of users consider it centralized. The Coordinator is a set of full-nodes scattered across the world and run by the IOTA Foundation. It creates periodic Milestones (zero-value transactions that reference valid transactions) which are validated by the entire network, and it sets the general direction for the tangle's growth. Every node verifies that the Coordinator is not breaking consensus rules by creating iotas out of thin air or approving double-spends; nodes only tell other nodes about transactions that are valid, so if the Coordinator starts issuing bad Milestones, nodes will reject them. The Coordinator has been optional since summer 2017; you can choose not to implement it in your full-node, and any talented programmer could replace the Coo logic in IRI with Random Walk Monte Carlo logic and go without its Milestones right now. A new kind of distributed coordinator is about to come, followed, at the last, by its complete removal. You can read more about the Coordinator here and here.
These are blockchain-based cryptocurrencies (Bitcoin) that rely on miners for security. Satoshi Nakamoto states several times in the Bitcoin whitepaper that "the system is secure as long as honest nodes collectively control more CPU power than any cooperating group of attacker nodes." We can see on Blockchain.info that nowadays half the total hashpower in Bitcoin is controlled by 3 companies (maybe only 1 in the future?). Users must trust that these companies will behave honestly and will not eventually use their >50% hashpower to attack the network. With all that said, it is reasonable to consider the IOTA network more decentralized (even with the Coordinator) than any mining-blockchain-based cryptocurrency. You can see a comparison between DAG cryptocurrencies here.
Some partnerships between the IOTA Foundation and big companies were well known even before they were officially announced. A few examples of confirmed partnerships are listed below; other confirmed partnerships can be seen in the "Partnerships with big companies" link in the pros section.
So what's up with all the alarm on social media about the IOTA Foundation faking partnerships with big companies like Microsoft and Cisco? On Nov. 28th the IOTA Foundation announced the Data Marketplace with 30+ companies participating. Basically it's a place for any entity to sell data (huge applications, hence many interested companies); at the time of writing (11/12/2017) there is no API for common users, and only companies in touch with the IOTA Foundation can test it. A quote from Omkar Naik (a Microsoft worker) featured in the Data Marketplace blog post gave the impression that Microsoft was in a direct partnership with IOTA. Several news websites ran headlines like "Microsoft and IOTA launches" (the same news site later claimed that IOTA lied about the Microsoft partnership) when in fact Microsoft was just one of the many participants in the Data Marketplace. Even though it is not a direct partnership, IOTA and Microsoft are in close touch, as seen in the IOTA, Microsoft and Bosch meetup on December 12th, the Microsoft IOTA meetup in Paris on the 14th, and "Microsoft Azure adds 5 new Blockchain partners" (May 2016). If you join the IOTA Slack channel you'll find that many other big companies, like BMW and Tesla, are in close touch with IOTA. This means that right now IOTA devs are working directly with scientists at these companies to help them integrate IOTA into their developments, even though no direct partnership has been published. I'll talk more about the use cases soon.
We are excited to partner with IOTA foundation and proud to be associated with its new data marketplace initiative... - Omkar Naik
IOTA's use cases
Every cryptocurrency can serve as a way to exchange goods: you pay for something using the coin's token and receive the product. Some are more popular, have faster transactions or offer anonymity, while others offer better scalability or user-friendliness. But none of them (except IOTA) can transfer information at no cost (fee-less transactions), in a secure form (MAM), with the assurance that the network will not be harmed as adoption grows (it scales). These characteristics open the gates for several real-world applications; you have probably heard of Big Data and how important data is nowadays.
Data sets grow rapidly - in part because they are increasingly gathered by cheap and numerous information-sensing Internet of things devices such as mobile devices, aerial (remote sensing), software logs, cameras, microphones, radio-frequency identification (RFID) readers and wireless sensor networks.
It’s just the beginning of the data period. Data is going to be so important for human life in the future. So we are now just starting. We are a big data company, but compared to tomorrow, we are nothing. - Jack Ma (Alibaba)
There are enormous quantities of wasted data (often over 99% is lost to the void) that could potentially contain extremely valuable information if allowed to flow freely in data streams, creating an open and decentralized data lake accessible to any compensating party. Some of the biggest corporations in the world, like Google, Facebook and Amazon, are purely digital. The data/information market will be huge in the future, and that's why so many companies are interested in what IOTA can offer. Several real-world use cases are being developed at the moment; many of them, if successful, will revolutionize the world. You can check a list of some of them below.
Not having your wallet set up properly (min weight 14, etc.)
These are problems that could easily be avoided with a better understanding of the network/wallet, or with a better wallet that could handle these issues. As I explained before, some problems during network "congestion" could simply be resolved if things were more user-friendly; this leads many users to store their iotas on exchanges, which is not safe either. The upcoming (Dec 2017) UCL Wallet will solve most of these problems. It will switch between nodes automatically and auto-reattach transactions, among other things. You can get a full overview of it here and here. Also, the upcoming Nelson CarrIOTA will help with automatic peer discovery so users can set up their nodes more easily.
IOTA Vulnerability issue
On Sept 7th, 2017, a team from MIT reported a cryptographic issue in the hash function Curl. You can see the full response from IOTA members below.
Funds were never in danger, as the scenarios depicted in Neha's blog post were not practically possible, and the arguments used in the post lacked fundamentals; you can check the whole history yourself in the responses. It was later discovered that Neha Narula's team was involved in other competing cryptocurrency projects. Currently IOTA uses the relatively hardware-intensive NIST standard SHA-3/Keccak for crucial operations, for maximal security. Curl is continuously being audited by more cryptographers and security experts. Recently the IOTA Foundation hired Cybercrypt, the world-leading lightweight cryptography and security company from Denmark, to take the Curl cryptography to its next maturation phase.
It took me a couple of days to gather the information presented here; I wanted to make it easier for people who want to get into IOTA. It probably has some mistakes, so please correct me if I said something wrong. Here are some useful links for the community.
This is my IOTA donation address, in case someone wants to donate I will be very thankful. I truly believe in this project's potential. I9YGQVMWDYZBLHGKMTLBTAFBIQHGLYGSAGLJEZIV9OKWZSHIYRDSDPQQLTIEQEUSYZWUGGFHGQJLVYKOBWAYPTTGCX
This is a donation address; if you want to do the same, pay attention to some important details:
Create a seed for only donation purposes.
Generate an address and publish it for everyone.
If you spend any iota, you must attach a new address to the tangle and refresh the donation address you previously published to everyone.
If someone sends iota to your previous donation address after you have spent from it, you will probably lose the funds sent to that specific address.
You can visualize how addresses work in IOTA here and here.
This happens because IOTA uses Winternitz one-time signatures to be quantum resistant. Every time you spend iota from an address, part of that address's private key is revealed. This makes it easier for attackers to steal that address's balance: attackers can check the tangle explorer to see whether an address has been reused and try to brute-force the private key, since they already know part of it.
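The danger of address reuse can be made concrete with a toy Winternitz-style hash-chain signature. This is hypothetical illustrative code (the `keygen`/`sign`/`verify` names and the 4-digit message are my own; real W-OTS adds a checksum over the digits precisely to block the in-signature forgery shown here, and IOTA's scheme differs in detail), but it shows why revealed intermediate chain values leak key material:

```python
import hashlib, os

W = 16  # Winternitz parameter: each hash chain signs one digit in 0..15

def H(data, times=1):
    """Iterate SHA-256 `times` times."""
    for _ in range(times):
        data = hashlib.sha256(data).digest()
    return data

def keygen(n_digits=4):
    sk = [os.urandom(32) for _ in range(n_digits)]   # chain seeds (private)
    pk = [H(s, W) for s in sk]                       # chain tops (public)
    return sk, pk

def sign(sk, digits):
    # Signing digit d reveals the intermediate chain value hash^d(seed)
    return [H(s, d) for s, d in zip(sk, digits)]

def verify(pk, digits, sig):
    # Hash each revealed value the remaining W-d times; it must reach the top
    return all(H(s, W - d) == p for s, d, p in zip(sig, digits, pk))

sk, pk = keygen()
sig1 = sign(sk, [3, 7, 1, 9])
assert verify(pk, [3, 7, 1, 9], sig1)

# Exposure: anyone holding sig1 can hash its values further along the chains
# and forge a valid signature for a digit-wise larger message, without sk.
forged = [H(s, 1) for s in sig1]
assert verify(pk, [4, 8, 2, 10], forged)
```

Each signature reveals points partway up the hash chains; a second signature from the same seeds reveals more of them, which is why a spent IOTA address should never receive funds again.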
The New Crypto Order & Escaping Financial Repression
The Vigilante's View
This is the first issue in months in which bitcoin hasn't hit an all-time high! And it's the last issue of the year. And what a year for cryptos it was. To put it in perspective, bitcoin could fall 90% from current levels and it would still have outperformed stocks, bonds and real estate in 2017. Bitcoin started 2017 at $960.79. At the time of this writing it is near $13,000, for a gain of 1,250% in 2017. And bitcoin was actually one of the worst-performing cryptocurrencies in our TDV portfolio in 2017! Ethereum (ETH) started 2017 at $8. It has since hit over $800 for a nice 10,000% gain in 2017. That's pretty good, but not as good as Dash, which started the year at $11.19 and recently hit $1,600 for a roughly 14,000% gain. I hope many of you have participated in these amazing gains! If not, or if you are new, don't worry: there will be plenty more opportunities in the years ahead. It won't all be just home runs though… in fact, some of the cryptos that have performed so well to date may go down dramatically or collapse completely in the coming years. I'll point out further below why the Lightning Network is not the answer to Bitcoin Core's slow speeds and high costs. And I'll look ahead to 2018 and how we could already be looking beyond blockchains. Yes, things are moving so fast that blockchain just became known to your average person this year… and could be nearly extinct by next year. That's why it is important to stick with us here at TDV to navigate these choppy free market waters!
New Years Reflection On The Evolution Of Consensus Protocols
Sooner or later crypto will humble you by its greatness. Its vastness is accompanied by a madness that is breathtaking, because you quickly realize that there is no stopping crypto from taking over the world. The moment you think you have everything figured out is the moment the market will surprise you. We are, for the first time, living and witnessing the birth of the first worldwide free market.
Throughout this rampage of innovation, we all are implicitly aiming for the best means of harnessing consensus. As we leave this bountiful 2017 and aim at 2018, it is important for us to meditate and appreciate the progress we have made in transforming the world through the decentralization of consensus. It is also important to reflect on the changes in consensus building we have partaken in and those yet to come. Consensus is the agreement that states “this is what has occurred, and this is what hasn’t happened.” Throughout the vastness of history, we humans have only really had access to centralized means for consensus building. In the centralized world, consensus has been determined by banks, states, and all kinds of central planners. As our readers know, any centralized party can misuse their power, and their consensus ruling can become unfair. In spite of this, many individuals still praise the effectiveness of consensus building of centralized systems. People from antiquity have had no other option but to trust these central planners. These systems of control have created still-water markets where only a few are allowed to compete. This lack of competition resulted in what we now can objectively view as slow innovation. For many, centralized consensus building is preferred under the pretense of security and comfort. Unfortunately, these same individuals are in for a whole lot of discomfort now that the world is innovating on top of the first decentralized consensus building technology, the blockchain. Everything that has occurred since the inception of bitcoin has shocked central planners because for the first time in history they are lost; they no longer hold power. We now vote with our money. We choose what we find best as different technologies compete for our money. What we are witnessing when we see the volatility in crypto is nothing more than natural human motion through price. 
The innovation and volatility of the crypto market may seem unorthodox to some, because it is. For the first time in history we are in a true free market. The true free market connects you to everybody, and for this reason alone the market shouldn't surprise us for feeling "crazy." Volatility is a sign of your connection to a market that is alive. Radical innovation is a sign of a market that is still in its infancy, discovering itself. In juxtaposing centralized consensus building with decentralized consensus building, I cannot keep myself from remembering some wise biblical words: "And no one pours new wine into old wineskins. Otherwise, the new wine will burst the skins; the wine will run out and the wineskins will be ruined." – Luke 5:37 The centralized legacy financial system is akin to old wineskins burst to shreds by the new wine of crypto. Decentralized consensus building has no need for central planners. For example, think about how ludicrous it would be for someone to ask the government for regulation after not liking something about crypto. Sorry, there is no central planner to protect you; even the mathematical protocols built for us to trust are now competing against one another for our money. These new mathematical protocols will keep competing against one another as they provide us with new options for decentralizing consensus. As we look to 2018, it is important that we as investors begin to critically engage with and analyze "blockchain-free cryptocurrencies."
HASHGRAPHS, TANGLES AND DAGS
Blockchain-free cryptocurrencies are technologies composed of distributed databases that use different tools to achieve the same objectives as blockchains. The top contenders in the realm of blockchain-free cryptos are DAGs (Directed Acyclic Graphs) such as Swirlds' Hashgraph, ByteBall's DAG, and IOTA's Tangle. These blockchain-free cryptos are also categorized as belonging to the 3rd generation of cryptocurrencies.
These technologies promise to be faster, cheaper, and more efficient than blockchain cryptocurrencies. Blockchains were the first means of creating decentralized consensus throughout the world. In a blockchain, a 51% majority determines the consensus. The limits of blockchains stem from their inherent nature, whereupon every single node/participant needs to know all of the information that has occurred throughout the whole blockchain economy of a given coin. This opens blockchains up to issues akin to the ones we have seen with Bitcoin's scaling. It is important to make a clear distinction in the language used between blockchains and blockchain-free cryptocurrencies. When we speak about blockchains it is more proper to describe their transaction consensus as "decentralized", whereas with blockchain-free cryptocurrencies it is best if we refer to transaction consensus as "distributed." Swirlds' Hashgraph incorporates a radically different approach to distributing consensus. Swirlds claims that their new approach will solve the scaling and security issues found on blockchains. They use a protocol called "Gossip about Gossip." Gossip refers to how computers communicate with one another when sending information. In comparison to a blockchain, imagine that instead of all of the nodes receiving all of the transactions recorded in the past ten minutes, only a few nodes shared their transaction history with other nodes near them. The Hashgraph team explains this as "calling any random node and telling that node everything you know that it does not know." That is, in Hashgraph we would be gossiping about the information we are gossiping; i.e., sending it to others throughout the network for consensus. Using this gossiped information builds the Hashgraph. Consensus is created by depending on the gossips/rumors that come to you and that you pass along to other nodes.
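The "tell a random node everything you know that it doesn't" rule spreads information exponentially fast. The toy simulation below is hypothetical code, not Swirlds' implementation (`gossip_rounds` and its push-only model are my own simplification), but it shows one piece of news reaching every node in a number of rounds that grows roughly with the logarithm of the network size:

```python
import random

def gossip_rounds(n_nodes, seed=42):
    """Rounds until one piece of news reaches all nodes, when every node
    repeatedly calls a random peer and shares everything it knows."""
    rng = random.Random(seed)
    know = [set() for _ in range(n_nodes)]
    know[0].add("event")                # one node starts with the news
    rounds = 0
    while any(not k for k in know):
        for i in range(n_nodes):
            j = rng.randrange(n_nodes)  # call a random peer...
            know[j] |= know[i]          # ...and tell it what you know
        rounds += 1
    return rounds

print(gossip_rounds(1_000))  # full spread takes a handful of rounds, not 1,000
```

Doubling the network size adds only about one extra round of gossip, which is the intuition behind why this style of consensus scales so well.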
Hashgraph also has periodic rounds which review the circulating gossips/rumors. Hashgraph is claimed to be capable of 250,000+ transactions per second (TPS), compared to the roughly 7 TPS Bitcoin currently allows: tens of thousands of times Bitcoin's throughput. There is no mention of a coin in their white paper. At this moment there is no Hashgraph ICO; beware of scams claiming that there is. There is, however, growing interest in the project along with a surge of app development. IOTA's DAG is known as the Tangle. Contrary to Hashgraph, IOTA does have its own coin, known as MIOTA, currently trading around the $3 mark. There are only 2,779,530,283 MIOTA in existence. The Tangle was also created to help alleviate the pains experienced with blockchain scaling. IOTA's Tangle creates consensus on a regional level; basically, neighbors look at what other neighbors are doing. As the tangle of neighbors grows with more participants, the security of the system increases, along with the speed of confirmations. IOTA has been criticized for its still-lengthy confirmation times and its current level of centralization via the Coordinator. This centralization is due to the fact that, at this moment, the main team acts as a watchtower overseeing how the Tangle network grows so that it does not suffer from attacks. Consensus is reached within IOTA by having each node confirm two transactions before that same node is able to send a transaction of its own. This leads to the mantra "the more people use IOTA, the more transactions get referenced and confirmed." This creates an environment where transactional scaling has no limits. IOTA has no transaction fees, and upon reaching high adoption its transactions ought to be very fast. Another promising aspect of IOTA is its integrated quantum-resistant signature algorithm, the Winternitz One-Time Signature Scheme, which would help protect IOTA against attacks by future quantum computers.
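The rule "approve two earlier transactions before issuing your own" can be sketched as a toy attachment model. This is hypothetical illustrative code (the `Tangle` class and its uniform parent selection are my own simplification; real IOTA selects tips with a weighted random walk and also requires a small proof-of-work, neither of which is modeled here):

```python
import random

class Tangle:
    """Toy tangle: every new transaction approves two earlier ones."""
    def __init__(self, seed=7):
        self.rng = random.Random(seed)
        self.approves = {0: []}   # genesis transaction approves nothing
        self.approved = set()     # transactions that someone has approved

    def issue(self, tx_id):
        # Approve two randomly chosen earlier transactions; real IOTA picks
        # tips via a weighted random walk, not uniformly at random.
        parents = self.rng.sample(list(self.approves), min(2, len(self.approves)))
        self.approves[tx_id] = parents
        self.approved.update(parents)

    def tips(self):
        # Tips are transactions nobody has approved yet
        return set(self.approves) - self.approved

t = Tangle()
for tx in range(1, 201):
    t.issue(tx)        # issuing a transaction is what confirms two older ones

print(len(t.tips()))   # recent, not-yet-approved transactions remain as tips
```

Because issuing a transaction is also the act of confirming two others, more traffic means faster confirmation, which is the scaling intuition behind the mantra quoted above.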
This without a doubt provides IOTA with much better protection against an adversary with a quantum computer when compared to Bitcoin. ByteBall is IOTA's most direct competitor. They both possess the same transaction speed of 100+ TPS, they both have their own respective cryptocurrencies, and they both have transparent transactions. ByteBall's token is the ByteBall Byte (GBYTE), with a supply of 1,000,000, currently trading at around $700. ByteBall aims to service the market with tamper-proof storage for all types of data. ByteBall's DAG also provides an escrow-like system called "conditional payments," which allows for conditional clauses before settling transactions. Like IOTA, ByteBall is also designed to scale its transaction volume to meet the needs of global demand. ByteBall provides access to integrated bots for transactions, which includes the capacity for prediction markets, P2P betting, P2P payments in chat, and P2P insurance. ByteBall's initial coin distribution is still being awarded to BTC and Bytes holders according to the proportional amounts of BTC or Bytes held per wallet. IOTA, ByteBall and Hashgraph are technologies that provide us with more than enough reasons to be hopeful for 2018. In terms of the crypto market, you don't learn it once. You have to relearn it every day because its development is so young. If you are new to crypto and feel lost at all, know that you are not alone. These technologies are constantly evolving with new competitive options in the market. As the technologies grow, ease of adoption is set to grow alongside innovation. We are all new to this world, and we are all as much in awe of its ingenuity as the next newbie. Crypto is mesmerizing not just for its volatility, which is a clear indication of how connected we now are to one another, but also because of the social revolution that it represents. We are experiencing the multidirectional growth of humanity via the free market.
Meanwhile Bitcoin Is Turning Into Shitcoin
It is with a great degree of sadness that I see bitcoin is on the cusp of destroying itself. Bitcoin Core, anyway. Bitcoin Cash may be the winner from all of this once all is said and done. Whether by design or by accident, bitcoin has become slow and expensive. Many people point out that IF the market were to upgrade to SegWit, all would be fine. I'll explain further below why many market participants have no incentive to upgrade to SegWit, meaning that the implementation of SegWit has been a massively risky guess that so far has not worked. Others say that the Lightning Network (LN) will save bitcoin. I'll point out below why that will not happen.
Lightning Networks And The Future Of Bitcoin Core
If you've been following bitcoin for any length of time, you're probably aware of the significant dispute over how to scale the network. The basic problem is that although bitcoin could at one time be used to buy, say, a cup of coffee, the number of transactions being recorded on the network bid up the price per transaction so much that actually sending BTC cost more than the cup of coffee itself. Indeed, analysis showed that there were many Bitcoin addresses with such small BTC holdings that the address itself couldn't be used to transfer them to a different address. These are referred to as "unspendable addresses." In the ensuing debate, the "big blockers" wanted to increase the size of each block in the chain in order to allow for greater transaction capacity. The "small blockers" wanted to reduce the size of each transaction using a technique called Segregated Witness (SegWit) and keep the blocks in the chain limited to 1 MB. SegWit reduces the amount of data in each transaction by around 40-50%, resulting in an increased capacity from 7 transactions per second to perhaps 15.
The software engineers who currently control the Bitcoin Core code repository have stated that what Bitcoin needs is "off-chain transactions." To do this, they have created something called Lightning Networks (LN), based on a software invention called the "two-way peg." Put simply, the two-way peg involves creating an escrow address in Bitcoin where each party puts some bitcoin into the account; then, outside the blockchain, they exchange hypothetical Bitcoin transactions that either of them can publish on Bitcoin's blockchain in order to pull their current agreed-upon balance out of the escrow address. Most layman explanations of how this works describe the protocol as each party putting an equal amount of Bitcoin into the escrow. If you and I want to start transacting off-chain, so we can have a fast, cheap payment system, we each put some Bitcoin in a multi-party address. I put in 1 BTC and you put in 1 BTC, and then we can exchange what are essentially cryptographic contracts that either of us can reveal on the bitcoin blockchain in order to exit our agreement and get our bitcoin funds. Fortunately, it turns out that those examples don't tell the whole story: it's possible for the escrow account to be asymmetric. That is, one party can put in 1 BTC while the other party puts in, say, 0.0001 BTC. (Core developer and forthcoming Anarchapulco speaker Jimmy Song tells us that there are game-theoretic reasons why you don't want the counterparty to have ZERO stake.) Great! It makes sense for Starbucks to participate with their customers in Lightning Networks because when their customers open an LN channel (basically a gift card) with them for $100, they only have to put in $1 worth of Bitcoin. Each time the customer transacts on the Lightning Network, Starbucks gets an updated hypothetical transaction that they can use to cash out that gift card and collect their bitcoin. The elephant in the room is: transaction fees.
In order to establish the escrow address and thereby open the LN channel, each party has to send some amount of bitcoin to the address. And in order to cash out and get the bitcoin settlement, one party also has to initiate a transaction on the bitcoin blockchain. Even to add funds to the channel, one party has to pay a transaction fee. Right now fees on the bitcoin blockchain vary widely and are extremely volatile. For a 1-hour confirmation transaction, the recommended fee from one wallet might be $12 US, while on another it's $21 US. For a priority transaction of 10-20 minutes, it can range from $22-30 US. Transaction fees are based on the number of bytes in the transaction, so if both parties support SegWit (remember that?) then the fee comes down by 40-50%. So it's between $6 and $10 US for a one-hour transaction and between $11-15 for a 15-minute transaction. (SegWit transactions are prioritized by the network to some degree, so actual times may be faster.) But no matter what, the customer and the merchant each have to spend $6 to establish that they will have a relationship, and either of them has to spend $6 in order to settle out and get their bitcoin. Further, if the customer wants to "top off" their virtual gift card, that transaction costs another $6. And because it adds an address to the merchant's eventual settlement, the merchant's cost to get their Bitcoin goes up every time that happens, so now it might cost them $9 to get their bitcoin. Since these LN channels are essentially digital gift cards, I looked up what it costs retailers to sell a customer a gift card. The merchant processor Square offers such gift cards on their retailer site. Their best price is $0.90 per card. So the best case is that Lightning Networks are 600% more expensive than physical gift cards to distribute, since the merchant has to put a transaction into the escrow address.
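Using the article's own fee estimates (the $6 post-SegWit on-chain fee and Square's $0.90 gift card are the figures quoted above, not live network data), the per-channel economics can be tallied in a few lines:

```python
# Tallying the article's own fee estimates (not live network data).
ONCHAIN_FEE = 6.0      # USD per on-chain SegWit tx at ~1-hour priority (above)
GIFT_CARD_COST = 0.90  # USD, Square's best per-card price (quoted above)

def channel_lifecycle_cost(top_ups=0):
    """On-chain fees for one open/settle cycle of a customer-merchant channel."""
    open_customer = ONCHAIN_FEE      # customer funds their side of the escrow
    open_merchant = ONCHAIN_FEE      # merchant stakes its side
    settle = ONCHAIN_FEE             # one party closes and settles on-chain
    refill = top_ups * ONCHAIN_FEE   # each "top off" is another on-chain tx
    return open_customer + open_merchant + settle + refill

print(channel_lifecycle_cost())                # 18.0 USD per open/settle cycle
print(round(ONCHAIN_FEE / GIFT_CARD_COST, 1))  # 6.7x a physical gift card
```

Even before any top-ups, just the merchant's single $6 entry into the escrow is already the "600% more expensive than a $0.90 gift card" comparison made above.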
Further, the customer is effectively buying the gift card for an additional $6, instead of just putting up the dollar amount that goes on the card. But it gets worse. If you get a gift card from Square, they process the payments on the card and periodically deposit cash into your bank account for a percentage fee. If you use the Lightning Network, you can only access your Bitcoin by cancelling the agreement with the customer. In other words, you have to invalidate their current gift card and force them to spend $6 on a new one! And it costs you $6 to collect your funds and another $6 to sell the new gift card! I’m sure many of you have worked in retail. And you can understand how this would be financially infeasible. The cost of acquiring a new customer, and the amount of value that customer would have to stake just to do business with that one merchant, would be enormous to make any financial sense. From time immemorial, when transaction costs rise, we see the creation of middlemen. Merchants who can’t afford to establish direct channels with their customers will have to turn to middlemen, who will open LN channels for them. Instead of directly backing and cashing out their digital gift cards, they will establish relationships with entities that consolidate transactions, much like Square or Visa would do today. Starbucks corporate or individual locations might spend a few USD on opening a payment channel with the middleman, and then once a month spend 6 USD to cash out their revenues in order to cover accounts payable. In the meantime, the middleman also has to offer the ability to open LN channels for consumers. This still happens at a fixed initial cost, much like the annual fee for a credit card in the US. They would continue to require minimum balances, and would offer access to a network of merchants, exactly like Visa and MasterCard today. 
This process requires a tremendous amount of capital because although the middleman does not have to stake Bitcoin in the consumer’s escrow account, he does have to stake it in the merchant’s account. In other words, if the Lightning Network middleman wants to do business with Starbucks to the tune of $100,000/month, he needs $100,000 of bitcoin to lock into an escrow address. And that has to happen for every merchant. Because every month (or so) the merchants have to cash out of their bitcoin to fiat in order to pay for their cost of goods and make payroll. Even if their vendors and employees are paid in bitcoin and they have LN channels open with them, someone somewhere will want to convert to fiat, and trigger a closing channel creating a cascading settlement effect that eventually arrives at the middleman. Oh, and it triggers lots of bitcoin transactions that cost lots of fees. Did I mention that each step in the channel is expecting a percentage of the value of the channel when it’s settled? This will come up again later. Again, if you’ve worked in the retail business, you should be able to see how infeasible this would be. You have to buy inventory and you have to sell it to customers and every part that makes the transaction more expensive is eating away at your margins. Further, if you’re the middleman and Starbucks closes out a channel with a $100,000 stake where they take $95,000 of the bitcoin, how do you re-open the channel? You need another $95,000 in capital. You have revenue, of course, from the consumer side of your business. Maybe you have 950 consumers that just finished off their $100 digital gift cards. So now you can cash them out to bitcoin for just $5700 in transaction fees, and lose 5.7% on the deal. In order to make money in that kind of scenario, you have to charge LN transaction fees. And because your loss is 5.7%, you need to charge in the range of 9% to settle Lightning Network transactions. 
Also, you just closed out 950 customers who now have to spend $5700 to become your customer again while you have to spend $5700 to re-acquire them as customers. So maybe you need to charge more like 12%. If you approached Starbucks and said “you can accept Bitcoin for your customers and we just need 12% of the transaction,” what are the odds that they would say yes? Even Visa only has the balls to suggest 3%, and they have thousands and thousands of times as many consumers as bitcoin. The entire mission of bitcoin was to be faster, cheaper and better than banks, while eliminating centralized control of the currency. If the currency part of Bitcoin is driven by “off-chain transactions” while bitcoin itself remains expensive and slow, then these off-chain transactions will become the territory of centralized parties who have access to enormous amounts of capital and can charge customers exorbitant rates. We know them today as banks. Even for banks, we have to consider what it means to tie up $100,000/month for a merchant account. That only makes sense if the exchange rate of bitcoin grows faster than the cost of retaining Bitcoin inventory. It costs nothing to store Bitcoin, but it costs a lot to acquire it. At the very least the $6 per transaction to buy it, plus the shift in its value against fiat that’s based on interest rates. As a result, it only makes sense to become a Lightning Network middleman if your store of value (bitcoin) appreciates at greater than the cost of acquiring it (interest rate of fiat.) And while interest rates are very low, that’s not a high bar to set. But to beat it, Bitcoin’s exchange rate to fiat has to outpace the best rate available to the middleman by a factor exceeding the opportunity cost of other uses of that capital. Whatever that rate is, for bitcoin, the only reason the exchange rate changes is new entry of capital into the “price” of bitcoin. 
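The middleman arithmetic above can be reproduced directly from the article's assumed figures (950 customers, $6 per settlement transaction, a $100,000 merchant channel); the variable names below are my own:

```python
# Reproducing the middleman arithmetic from the article's assumed figures.
customers = 950
onchain_fee = 6.0            # USD per settlement transaction
merchant_stake = 100_000.0   # USD locked in the merchant channel

settle_cost = customers * onchain_fee    # cashing every customer channel out
loss_pct = round(settle_cost / merchant_stake * 100, 1)  # fees as % of volume
reacquire = customers * onchain_fee      # customers must reopen channels

print(settle_cost)              # 5700.0 USD to settle the customer side
print(loss_pct)                 # 5.7 (% of the $100,000 channel volume)
print(settle_cost + reacquire)  # 11400.0 USD of churn per settlement cycle
```

Covering a ~5.7% settlement loss plus the re-acquisition churn is what pushes the required LN fee toward the 9-12% range estimated above, versus the ~3% Visa charges.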
For that to work, bitcoin’s “price” must continue to rise faster than the cost of capital for holding it. So far this has happened, but it’s a market gamble for it to continue. Since it happens because of new capital entering into the bitcoin network and thus increasing the market cap, this results in Bitcoin Core becoming the very thing that its detractors accuse it of: a Ponzi scheme. The cost of transacting in Bitcoin becomes derived from the cost of holding bitcoin and becomes derived from the cost of entering bitcoin. Every middleman has to place a bet on the direction of bitcoin in a given period. And in theory, if they think the trend is against Bitcoin, then they’ll cash out and shut down all the payment channels that they transact. If they bought bitcoin at $15,000, and they see it dropping to $13,000 — they’ll probably cash out their merchant channels and limit their risk of a further drop. The consumer side doesn’t matter so much because their exposure is only 1%, but the merchant side is where they had to stake everything. If you’re wondering why this information is not widely known, it’s because most bitcoin proponents don’t transact in bitcoin on a regular basis. They may be HODLing, but they aren’t doing business in bitcoin. Through Anarchapulco, TDV does frequent and substantial business in bitcoin, and we’ve paid fees over $150 in order to consolidate ticket sale transactions into single addresses that can be redeemed for fiat to purchase stage equipment for the conference. For Bitcoin to be successful at a merchant level via Lightning Networks, we will have to see blockchain transactions become dramatically cheaper. If they return to the sub-$1 range, we might have a chance with centralized middlemen, but only with a massive stabilization of volatility. If they return to $0.10, we might have a chance with direct channels. Otherwise, Lightning Networks can’t save bitcoin as a means of everyday transaction. 
And since that takes away its utility, it might very well take away the basis of its value, and bitcoin could find itself truly being a tulip bubble. One final note: there are some parties for whom all these transactions are dramatically cheaper. That is the cryptocurrency exchanges. Because they are the entry and exit points for bitcoin-to-fiat, they can eliminate a layer of transaction costs and thus offer much more competitive rates, as long as you keep your bitcoin in their vaults instead of securing it yourselves. Sending it out of their control lessens their competitive advantage against other means of storage. It comes as no surprise, then, that they are the least advanced in implementing the SegWit technology that would improve transaction costs and speed. If you buy bitcoin on Poloniex, it works better for them if it's expensive for you to move that coin to your Trezor. In fact, an exchange offering Lightning Network channels to merchants could potentially do the following:
1) Stake bitcoins in channels with merchants. These coins may or may not be funds that are held by their customers. There is no way to know.
2) Offer customers "debit card" accounts for those merchants that are backed by the Lightning Network.
3) Establish middle addresses for the customer accounts and the merchant addresses on the Lightning Network.
4) Choose to ignore double-spends between the customer accounts and the merchant addresses, because they don't actually have to stake the customer side. They can just pretend to, since they control the customer's keys.
5) Inflate their bitcoin holdings up to the stake from the merchants, since the customers will almost never cash out in practice.
In other words, Lightning Networks give exchanges a clear path to repeating Mt. Gox: lie to the consumer about their balance while keeping things clean with the merchant. That is, establish a fractional-reserve approach to bitcoin.
So, to summarize: Bitcoin Core decided that increasing the block size from 1 MB to 2-8 MB was “too risky” and created SegWit instead, which the market has not adopted. When asked when bitcoin will be faster and less expensive to transfer, most Bitcoin Core adherents say the Lightning Network will fix the problems. But, as I’ve just shown, the Lightning Network makes no sense for merchants to use. It will likely result in banks taking over LN nodes, making BTC similar to Visa and Mastercard but more expensive, and in exchanges becoming like today’s banks, running fractional-reserve systems that make bitcoin not much better than the banking system we already have. Or, people can switch to Bitcoin Cash, which simply increased the block size and has much faster transaction times at a fraction of the cost.

I’ve begun to sell some of my bitcoin holdings because of what is going on. I’ve increased my Bitcoin Cash holdings and also increased my holdings of Dash, Monero, Litecoin and our latest recommendation, Zcash.

Other News & Crypto Tidbits

When bitcoin surpassed $17,600 in December, it surpassed the total value of the IMF’s Special Drawing Rights (SDR) currency. Meanwhile, Alexei Kireyev of the IMF put out his working paper, “The Macroeconomics of De-Cashing,” in which he advises abolishing cash without making the public aware of the process. Countries such as Russia are considering creating a cryptocurrency backed by oil to get around the US dollar and the US-dollar banking system. Venezuela is as well, although we highly doubt it will be structured properly or function well, given the communist government’s track record of destroying two fiat currencies in the last decade. To say that the US dollar is being attacked on every level is not an overstatement. Cryptocurrencies threaten the entire monetary and financial system, while oil-producing countries look to move away from the US dollar to their own oil-backed cryptocurrencies.
And all this as bitcoin surpassed the value of the IMF’s SDR in December, while in 2017 the US dollar had its largest drop versus other currencies since 2003. Cryptocurrency exchanges have begun to surpass even the NASDAQ and NYSE in terms of revenue. Bittrex, as one example, had $3 billion in volume on a single day in December. At a 0.5% fee per trade, that equaled $15 million in revenue in just one day. If that were to continue for 365 days, it would mean about $5.5 billion in annual revenue, more than the NASDAQ or NYSE made this year.

Conclusion

I never would have guessed how high the cryptocurrencies would go this year. My price target for bitcoin in 2017 was $3,500! That call was made in late 2016, when bitcoin was near $700 and many people said I was crazy. Things are speeding up much faster than even I could have imagined.

And it is about much more than just making money. These technologies, like cryptocurrencies, blockchains and beyond, connect us in a more profound way than Facebook ever could. We are now beginning to be connected in ways we never even thought of, and to some degree still do not understand. These connections within this completely free market are deep and meaningful. This is sincerely beautiful, because we are constantly presented with an ever-growing buffet of competing protocols selling us their best efforts at providing harmony within the world. What all of these decentralized and distributed consensus-building technologies have in common is that they connect us to the world and to each other. Where we are going, we don’t need foolish and trite Facebook emojis.

As we close a successful 2017, we look with optimism towards a much more prosperous 2018. The Powers That Shouldn’t Be (TPTSB) can’t stop us. As we move forward, note how much crypto will teach you about yourself and the world. In a radical free market, making our own bets will continue to be a process of self-discovery.
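The exchange-revenue figures above are easy to check. Using only the numbers given in the text ($3 billion in one-day volume, a 0.5% fee per trade), a quick sanity check of the daily and annualized totals:

```python
# Sanity-checking the exchange-revenue arithmetic from the text.
daily_volume = 3_000_000_000      # $3B traded on Bittrex in one day (per the text)
fee_rate = 0.005                  # 0.5% fee per trade

daily_revenue = daily_volume * fee_rate
annual_revenue = daily_revenue * 365   # naive extrapolation of one day to a year

print(daily_revenue)   # 15000000.0  -> $15M per day
print(annual_revenue)  # 5475000000.0 -> roughly $5.5B per year
```

Note this is a straight-line extrapolation of a single peak day, so the annual figure is an upper-bound illustration rather than a forecast.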
Crypto will show us the contours of our fears and the contours of our greed, and will constantly challenge us to do our best with the knowledge we have. Remember, randomness and innovation are proper to the happenstance nature of a true digital free market. Happy New Year, fellow freedom lovers! And, as always, thank you for subscribing!

Jeff Berwick