The Flash Crash occurred at roughly 2:45 PM on May 6th, 2010. The Dow Jones Industrial Average plunged hundreds of points, and within minutes had recovered most of the loss. It was, and still is, continually referenced by people who believe that HFT is somehow damaging the markets at large, that without algorithms and computers prices would never have dropped so quickly, and that it represents all that is wrong with modern trading. I have a different perspective. I spent about eight years in the HFT industry, and am intimately familiar with the nitty-gritty details of modern exchange technology, market making algorithms, low latency topologies, and system level design and architecture. That makes me somewhat of an insider, and not unbiased. I was sitting feet away from a trading desk when the flash crash occurred, and I'm writing this to set the record straight.

Before I do, let's put some ground rules in place. There are some simple concepts that need to be understood in order to talk about modern trading systems. I'm going to run through them briefly so that the astute reader has the necessary context. I'll try not to be too technical, and I'm going to gloss over some of the long-winded details; brevity is, after all, the soul of wit.

  • First, the value of a stock (and anything, really) is determined by what someone is willing to pay for it at any given time in any given market. There is no intrinsic worth. Right now Google (GOOG) is trading at $597.11 on NASDAQ, because someone was willing to purchase a share for $597.11. This seems simple enough, but is crucial to understanding what happened on May 6th.

  • Second, most modern markets are built around a two sided order book. An order book lists prices and quantities that people are willing to sell at, and prices and quantities that people are willing to buy at. At any given time there may be a significant number of orders that are below or above the current price in the market. In other words, right now there are 100 shares available at $597.11, 200 shares at $599, and so on. The order book has depth, which is not immediately obvious (see the sketch after this list).

  • Third, nobody is forced to buy or sell anything at a price they didn't agree to (with some esoteric and irrelevant exceptions). Bids placed into the market, regardless of their duration, risk capital in order to participate. I have no obligation to buy from you at any price, and certainly can't be forced to sell to you at a price I didn't agree on.

  • Fourth, market makers are generally firms that are willing to buy and sell at quoted prices, providing liquidity to a given market on a consistent basis. Many of these firms get reductions in fees based on volume to incentivize trading.

  • Finally, liquidity is the availability of a given instrument. A highly liquid market means that there are active buyers and sellers, and depth-of-book that can cover large orders. Price discovery is the process by which the best price is found.
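
To make the order book idea concrete, here is a minimal sketch in Python. The prices and sizes are invented for illustration; real matching engines track far more state than this.

    # A toy two-sided order book: resting bids and asks with price and quantity.
    # Prices and sizes are made up; real books track many more fields per order.
    from dataclasses import dataclass

    @dataclass
    class Level:
        price: float
        quantity: int

    # Asks are sorted from the lowest price up, bids from the highest price down.
    asks = [Level(597.11, 100), Level(599.00, 200), Level(605.00, 500)]
    bids = [Level(596.90, 150), Level(595.00, 300), Level(590.00, 1000)]

    best_ask = asks[0]   # cheapest price anyone will sell at right now
    best_bid = bids[0]   # highest price anyone will pay right now
    spread = best_ask.price - best_bid.price

    print(f"Best bid {best_bid.price}, best ask {best_ask.price}, spread {spread:.2f}")
    print("Depth on the ask side:", sum(level.quantity for level in asks), "shares")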

With that out of the way, let's dive into some of the structural components of modern trading.

There isn't a single market. Shares of any given equity may trade on many different markets: NYSE, NASDAQ, BATS, NSX, Arca, Alternext, and so on. The system is distributed, such that it is possible to execute orders across multiple markets simultaneously. Arbitrage is the process of capturing the difference in price between two separate markets. For example, if two different exchanges list the same equity at different prices, one could make money by buying in the first and selling in the second, thereby capturing the spread (the difference in pricing).

Prices are often nearly identical across markets because arbitrage pressure causes price convergence. The speed at which this convergence occurs is a race between participants looking to make the spread. Many High Frequency Trading companies play this game all day long between markets all over the world. It is a simple idea that is technically difficult to execute and highly competitive.
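
Here is a hedged sketch of that spread-capture check. The venue names and quotes are invented, and a real strategy would also have to account for fees, latency, and the risk that one leg of the trade fails.

    # Toy cross-market arbitrage check: if one venue's best ask is below another
    # venue's best bid, buying on the first and selling on the second captures
    # the spread. All quotes here are invented for illustration.
    quotes = {
        # venue: (best_bid, best_ask)
        "VENUE_A": (597.05, 597.11),
        "VENUE_B": (597.15, 597.20),
    }

    def find_arbitrage(quotes):
        for buy_venue, (_, ask) in quotes.items():
            for sell_venue, (bid, _) in quotes.items():
                if sell_venue != buy_venue and bid > ask:
                    yield buy_venue, sell_venue, bid - ask

    for buy_on, sell_on, edge in find_arbitrage(quotes):
        print(f"Buy on {buy_on}, sell on {sell_on}, capture {edge:.2f} per share")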

Markets are often connected together and in certain circumstances can route orders off market in order to satisfy them. However, participants have an incentive to connect to multiple markets in order to increase available liquidity, match large orders across markets, and get best-price execution. In other words, the more data you can ingest, and the faster you can ingest it, the better pricing you can theoretically get. When you place an order through a commercial service, it generally shows you the best price available anywhere based on the information it has.

The flow of information across markets is extraordinarily fast. Covering geographic distances at the speed of light in fiber, or via microwave propagation, usually takes milliseconds. Modern platforms can match and execute orders internally in microseconds (and in certain cases in nanoseconds!). The faster this process happens, the quicker we get price convergence across geographic areas. A human being might not be able to decide in a microsecond whether he wants to buy something, but he can tell a computer that he wants the lowest available price. Modern trading offloads intent onto computers because fluctuations in time and the complexity of a distributed exchange system make entirely human trading obsolete in many markets with public price discovery.
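
As a rough back-of-the-envelope illustration of those propagation times: the distance below is approximately the New York to Chicago straight-line path, and the two-thirds-of-c figure for light in fiber is an approximation.

    # Rough one-way propagation time between two venues ~1,200 km apart.
    SPEED_OF_LIGHT_VACUUM_KM_S = 299_792   # km/s
    FIBER_FRACTION = 2 / 3                 # light in fiber travels at roughly 2/3 c
    distance_km = 1_200                    # approximate NY - Chicago straight line

    fiber_ms = distance_km / (SPEED_OF_LIGHT_VACUUM_KM_S * FIBER_FRACTION) * 1000
    microwave_ms = distance_km / SPEED_OF_LIGHT_VACUUM_KM_S * 1000  # near line-of-sight c

    print(f"Fiber one-way:     ~{fiber_ms:.1f} ms")      # ~6 ms
    print(f"Microwave one-way: ~{microwave_ms:.1f} ms")  # ~4 ms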

Alright, that was a lot to digest. The implications of algorithmic trading are pretty widespread, and there is an argument to be made on both sides as to whether or not it is necessary. It is important to note that major exchanges make their money by charging transactional fees per trade, and that the majority of the volume in any given market is being driven by algorithmic trading systems. So on the one hand we have public exchanges that must follow the rules and regulations, but are benefiting from the way the system is currently constructed because massive volume directly affects their bottom lines. I believe that market competition is a good thing, and that arbitrary rules and regulations have a detrimental effect on the industry. In the 1980s you could spend a significant amount of money in brokerage fees, and get pretty terrible pricing on top of that because of the lack of transparent price discovery. It is an unpopular opinion amongst those who think that HFT is robbing them silently in the night, but I believe that $4.95 trades on E-Trade are directly possible because of how computers have changed the trading landscape, increased competition, and ultimately passed massive savings on to consumers. I could write an entire article about how HFT is actually helping the average person and is completely misunderstood, but on to the Flash Crash!

Now that we have some of the necessary background, let's talk about that May day. In the official SEC-CFTC report, the Chicago Mercantile Exchange (CME) is fingered as the place the trouble started. Allegedly, large futures trades triggered instability in the underlying. Futures are derivative contracts that allow speculation on the future price of an underlying asset. The E-Mini tracks the price of the S&P 500 index at a future date. It can be replicated with a basket of stocks representing the S&P 500, or various collections of other contracts. A simple HFT strategy would be to trade that basket against the future's value, back and forth all day long. So conceivably a very large trade on the E-Mini could cause various firms to rush to the equities markets to cover their positions. Given the size of the trading involved, I think it is highly unlikely that this happened, or at least that the effect was widespread enough to cause a systemic failure. The CME's counter-argument is compelling in this regard, but is too technical to discuss here.

Regardless of how it started, sell pressure increased on the NYSE as firms started unwinding positions, which began to drive prices down. This is a normal occurrence, but what happened next is anything but normal. NYSE hiccupped. The increased volume started queuing in their systems, resulting in large delays in execution. In other words, the time from placing an order until it executed started to spiral upwards quickly as their systems failed under the load. Now, if you remember from above, the tolerances of many of these HFT systems are tight. Most firms began cancelling their orders, or physically disconnecting to take advantage of exchange-supplied cancel-on-disconnect functionality, because they assumed there was a systems fault in progress and didn't want to get caught with orders that would execute in a market they couldn't trade in. Liquidity was routed to other venues that were known to be healthy, and NYSE was dropped by the HFT firms providing liquidity because it clearly wasn't functioning properly.

When all of the orders in a market disappear, the book becomes spectacularly thin. This means that people sitting way off the market, with test orders or just generally hanging out, could end up having a trade go through at an outrageous price. This is exactly what happened: people with market orders (taking the best available price in the market) were suddenly buying and selling way off market because all of the sophisticated trading firms had taken their liquidity elsewhere while NYSE was getting its act together. This resulted in a roughly 600 point drop as the books of multiple stocks thinned out.
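
To see why a market order in a thinned-out book fills at outrageous prices, here is a minimal sketch. The prices and sizes are invented, but stub quotes far off the market really were hit that afternoon.

    # A market sell order takes whatever bids are resting in the book, best first.
    # With normal depth it fills near the last price; with a thinned-out book it
    # walks all the way down to whatever stale or "stub" orders remain.
    def fill_market_sell(bids, quantity):
        """bids: list of (price, size) sorted from best (highest) to worst."""
        fills = []
        for price, size in bids:
            if quantity <= 0:
                break
            take = min(size, quantity)
            fills.append((price, take))
            quantity -= take
        return fills

    normal_book = [(39.98, 500), (39.97, 800), (39.95, 2000)]
    thin_book = [(39.90, 100), (25.00, 50), (0.01, 10_000)]  # stub order far off market

    print("Normal book:", fill_market_sell(normal_book, 1000))
    print("Thin book:  ", fill_market_sell(thin_book, 1000))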

Tellingly, minutes later, when NYSE restored normal operation, pricing rapidly normalized as trading firms that had been locked out of the market reentered once they determined that systems were functioning normally. In other words, the combined logic of the various HFT firms resulted in routing around systemic failure, and immediately reinforcing the market once it had been restored. This is exactly the behavior one wants to see in a resilient distributed market.

Consider the situation from the other side. There was no way for an HFT firm to determine instantly what was going on at NYSE. It could have been the firm's own fault, a faulty computer, a broken cable, a sliced fiber. The exchange itself could have exploded due to a nuclear attack, an earthquake, a tidal wave, or another act of God. Yet, despite NYSE being incommunicado, trading continued normally at other exchanges, and full service was restored within minutes at the NYSE. That is called good engineering.

When things go wrong, everybody looks for a scapegoat. HFT was a good one. Banks are uncomfortable with firms eating their tasty margins (look at the recent launch of IEX as an example of this). NYSE doesn't want to admit fault of any kind, and HFT firms are hard pressed to explain what they do, or how it positively affects other participants in the market. It's easy to point fingers. Nanex made some incredibly flawed claims based on data that cannot possibly support their conclusions, most of it aggregated tick data which doesn't reveal individual firm intent, but anti-HFT advocates jumped at the chance of having concrete proof that their worst fears were realized. The reality is always a bit more nuanced.

This may not have convinced you, it is only one person's account, but at the very least you should consider the possibility that it is correct, and that any preconceived notions you have about the effects HFT has on the market may not be accurate. These are complex and dynamic systems, and they can interact in unexpected ways, but in this case they did the right thing.

The granddaddy of the alternative currency movement, Litecoin has the second largest market capitalization at $113 million. Originally forked from the Bitcoin code base and created by Charlie Lee in October of 2011, Litecoin was an experiment that tinkered with some of the network's fundamental parameters. The confirmation time was reduced to 2.5 minutes, the maximum number of coins was increased, and most fundamentally the mining algorithm was changed to Scrypt.

Scrypt is a memory-hard algorithm: it is constrained more by total available memory than by processor speed. Bitcoin, in contrast, uses SHA-256, which is limited largely by processing throughput, is easily parallelized, and was quickly commoditized by inexpensive ASICs that dramatically increased the hashing power of the network. The mining algorithm is an essential ingredient in how most blockchain-based decentralized networks work. By tinkering with the formula, Litecoin created the first ASIC-resistant blockchain. In other words, the cost of producing dedicated equipment for mining Litecoin was prohibitive in the early days, which prevented the network from being taken over by people doing dedicated Bitcoin mining. If that made sense you are probably already part of the cryptocurrency faithful, but in short this change allowed the network to develop organically without being subverted too quickly by existing cryptocurrency miners.
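
A hedged illustration of the difference using Python's standard library: the double SHA-256 below mirrors Bitcoin-style proof of work, while the scrypt call shows the memory-cost knob. The scrypt parameters here are illustrative examples rather than a claim about any particular coin's exact settings.

    import hashlib, os

    header = os.urandom(80)  # a block header is 80 bytes; random stand-in here

    # Bitcoin-style proof of work: double SHA-256, cheap on memory and easy to
    # bake into dedicated silicon.
    sha_digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()

    # Scrypt-style proof of work: each hash attempt must touch a block of memory.
    # The n and r parameters control that memory cost (illustrative values).
    scrypt_digest = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)

    print(sha_digest.hex())
    print(scrypt_digest.hex())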

The end result is that Litecoin saw fairly spectacular growth in its own right; check out the chart at Coinmarketcap. A simple but important change was made to the core platform that helped spawn an additional, parallel, and relatively secure trust network. While not quite as large or fortified as Bitcoin, it does represent a viable alternative despite the shared code base. There has been continued development since that time, but there has also been divergence. The Bitcoin and Litecoin code looks fairly different at this point, driven by differing levels of effort and design decisions made by their respective core contributors.

Litecoin can definitely be classified as a legitimate attempt to push forward digital currency research, and it made enough changes to be interesting in its own right. Its long-term potential remains very questionable though. There are a lot of Scrypt-derivative coins at this point, but we are already starting to see very sophisticated Scrypt ASICs enter the market, capable of significant increases in hashing power. This will probably damage newer entrants, because they will not be able to secure their networks until they hit critical mass. That is good news for Litecoin, which has already established a dominant position. If the value is ultimately in the network and the community supporting it, then Litecoin has a good future, but competitors that link directly into Bitcoin might undermine its main value proposition and are lurking on the horizon.

I'm holding my Litecoin position, but not acquiring more. I consider it a hedge against certain classes of problems with the Bitcoin network, but ultimately believe that, barring some major issue, Bitcoin will continue to be the dominant player.

What are Altcoins? Why do they matter, and should we pay attention? Should I invest in them? Are they damaging the credibility of digital currency? According to Coinmarketcap there are 454 alternative digital currencies with non-negligible market capitalization. It is probably safe to say that there are thousands in total, some abandoned, some fledgling, and 95% or more of them completely and utterly useless. Altogether they have a combined value of over 329 million USD. A lot of money to be sure, but to put that in perspective, the Bitcoin market cap is 20 times the size of all of the others combined. Why is that?

Altcoins are largely copies of the Bitcoin code where the initial parameters have been modified to create another blockchain. In other words, a Bitcoin duplicate or doppelganger: a fresh coat of marketing paint and a fancy name, but fundamentally identical in function. However, just because something functions the same doesn't mean it is equal. The real value in Bitcoin, and largely what accounts for the massive discrepancy in capitalization, is the security of the network. Hundreds of millions of dollars have poured into creating a large, distributed, and fortified network that at its heart is capable of transferring data securely between untrusted parties pseudo-anonymously. This fundamental utility is unmatched by most of the alternative coins. It is this security that dramatically amplifies the total value of the network. Any alternative coin that claims to be as secure as Bitcoin would need massive buy-in from the miners and individuals that make up the Bitcoin network.

If the network can't be matched, what is the point? The real value of Altcoins is in trying out new, innovative ideas, exploring the boundaries of what is possible with digital currency, and validating different approaches. Digital currency and trust networks are an entirely new way of facilitating payments online, and like any new technology there is a lot of work to be done to build robust next generation networks. Being able to quickly copy the Bitcoin code and try new ideas is fundamental to advancing the state of the art. Unfortunately, most Altcoins are created solely to profit off of unsophisticated investors, who lack the technical competence or independent third party resources to validate the claims being made. Many who sat on the sidelines as Bitcoin made a meteoric ascent are wondering if they missed the boat, and whether investing in some of these alternative currencies might make sense.

The Altcoin has become a penny stock on steroids, except without any of the safety nets meant to shield consumers from outright fraud, a subject I've written about in the past with respect to zetacoin. Completely unregulated digital assets, backed by very questionable assertions, and a lot of marketing spin. Very few of them have any intrinsic value, or have made substantial contributions to digital currency research. This shouldn't be construed as investment advice, but investors should realize that Altcoins are extremely speculative assets, are easily manipulated by their creators or third parties, and the vast majority have absolutely no value whatsoever. One shouldn't invest money they can't afford to lose in technologies they don't understand.

One of the questionable claims that many Altcoins make is about confirmation times. This has got to be the most misleading metric that keeps being thrown around. Bitcoin is infamous for its choice of a 10 minute confirmation time. That means on average it takes 10 minutes for a transaction to be validated by the network, and up to an hour for a transaction to be indelibly inked in the global ledger. These are simple parameters of the network that can be changed in just a few lines of code. For argument's sake let's say I do that, and release my new coin, SuperFastBest coin. SuperFastBest coin has confirmation times of 1 second; by my calculations it is 600 times faster than Bitcoin and thus must be worth at least 600 times more. If it were this simple to increase the speed of Bitcoin, why didn't Satoshi make confirmation times 1 second? For a project of such immense complexity it seems rather short-sighted to make transactions so slow.

The answer is that decreasing confirmation times increases the number of orphaned blocks. In other words, the network does not have time to reach uniform consensus, and so alternate chains begin forming in different parts of the world. The end result is substantially decreased integrity and security. SuperFastBest coin might be validated in only a second, but it will take hundreds of seconds for the network to converge to the point where I can trust that the transaction actually happened. Dogecoin and others with shorter confirmation times require more processing power and memory, and ultimately are not much faster or better than Bitcoin already is. Coins that market themselves based on transaction times should be distrusted: either the developers are incompetent, or they are purposely misleading you.
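
One crude way to see the trade-off is a back-of-the-envelope orphan model, which I'm using purely for illustration rather than as a precise analysis of any particular coin: if a block takes p seconds to propagate across the network and blocks are found on average every T seconds, the chance a given block gets orphaned is roughly 1 - e^(-p/T).

    import math

    def orphan_rate(propagation_s, block_interval_s):
        """Rough probability that a competing block is found while ours propagates.
        Block discovery is modeled as a Poisson process; this ignores many real
        effects (relay optimizations, miner topology, selfish mining)."""
        return 1 - math.exp(-propagation_s / block_interval_s)

    propagation = 10  # assume ~10 seconds for a block to reach most of the network
    for name, interval in [("Bitcoin (600 s)", 600),
                           ("Litecoin (150 s)", 150),
                           ("SuperFastBest (1 s)", 1)]:
        print(f"{name:20s} orphan rate ~ {orphan_rate(propagation, interval):.1%}")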

In short, when people make extraordinary statements about their new cryptocurrency, be ready to ask for proof. Get feedback from people in the community who have the technical expertise to validate those claims, and be extraordinarily wary of any altcoin that can't back up its claims with reasoned argument. Bubbles in the market are largely based on hype without fact, and that only serves to detract from the legitimate work going on in the Bitcoin space. I'm going to devote some more time on this blog to talking specifically about the leaders in market capitalization, and what makes them interesting or scammy. If you have a particular favorite you would like me to look into, send me a tweet!

I remember the Internet in 1991. I had a Panix account, a venerable and awesome Internet Service Provider in the New York City area. I would fire up my 1200 baud modem and connect to an incredible world dominated by Telnet, Gopher, WAIS, and Usenet. I stumbled upon LambdaMOO and was blown away that a community of people were actively speaking and engaging with each other over this nascent Internet. It was fascinating, weird, felt a little bit dangerous and seedy, like a secret society. The technology was interesting, with tremendous potential, but it had yet to be commercially exploited, and frankly nobody outside academia or the military really understood it.

Then the modern web was born. October 13, 1994 was the day Netscape was released, and it completely changed the way people used the Internet. Sure, the web had been developed several years before, and early web browsers like NCSA Mosaic already existed, but Netscape changed the game. Suddenly, it wasn't difficult to use the Internet. A few clicks and you were online. A few more clicks and you were spiraling down a never-ending series of pages loosely strung together all over the net. There was an addictive quality to the experience despite the fact that by modern standards it looked absolutely terrible. Browsing those early pages now is an adventure in frustration, but by the standards of the day it was transcendent.

That was a pivotal moment in the story of the Internet. A weird government science project turned commercial. A strange and forbidding world became accessible. Almost three billion people use the network now. It is deeply ingrained in the working and personal lives of individuals the world over. It is almost hard to imagine not having the Internet. It is a surrogate teacher, a way of staying instantly connected to everyone and everything that is important in our everyday lives. It has uprooted and completely supplanted entire industries, and made obsolete ways of doing business that were deeply entrenched. It has changed humanity in ways that we are still trying to understand, and the pace of that change is accelerating dramatically.

Then there was Bitcoin. For the first time in human history two people can exchange value over distance, without knowing each other, and without trusting each other or any intermediary. For the first time I can pass information to someone else over an insecure network and expect them to get it. I can facilitate transactions that cannot be subverted or duplicated. I can refer to a ledger that cannot be altered, one that records that a transaction took place, and I can leverage computers to programmatically create and manage these transactions. Bitcoin is difficult. It is hard to understand. It is hard to use. It is confusing to normal people. It is confusing to abnormal people. These same people were confused in 1994 when Netscape came out, because they lacked vision; they lacked imagination.

Bitcoin hasn't had its Netscape moment. It took almost twenty-five years for the Internet to have its time to shine, and Bitcoin is only five years old. What a five years it has been. A technology has literally been dropped into society that could fundamentally reshape the way business is done. Think of Bitcoin as an IP address for people: a way of transferring value to and receiving money from every single person on the planet, directly. The traditional banking sector can't even begin to provide services to the billions of people who live on just dollars a day because of the friction involved in moving money across and between countries. Now the friction is gone.

But it's more than that. In fact, describing Bitcoin as a way of transferring money is kind of like saying the Internet is a way to send E-Mail. It completely misses the point; it lacks imagination. Bitcoin has a market capitalization of almost seven billion dollars at the time of this writing, but more importantly represents hundreds of millions of dollars in commercial investment meant to sustain the network. Quietly a new network has been forming, a network that adds trust to the Internet. Bitcoin is an all-access pass to trusted communications and all of the services that entails. Why stop at transferring value? Why can't we move contracts between people or organizations? Why can't we have intelligent agents that manage our day to day finances? How can we codify the rules of industry to decrease cost, to automate the tedium of life, to make things more convenient, faster, better?

So we have not only a frictionless mechanism for delivering value, but the machinery to build much greater things. The point is that we haven't even begun to see the types of technology we can build on this network, and the applications we do have are more akin to Telnet than the WWW. In the early nineties it was difficult to envision what the Internet would become. It required both prescience and arrogance to believe that it would penetrate the fabric of society as deeply as it has. At the risk of being both, trust networks built on digital currency are going to change our world just as fundamentally. We are going to bake them into the protocols that underlie modern commerce. We are going to integrate them into human labor and automated agents. We are going to build something that has never existed before and will benefit us all in ways that we cannot possibly begin to imagine.

Minting currency is hard. All of that smelting, printing, special ink, security thread, optical illusions, and microprint. Yet, after all of that work, hundreds of millions of fake bills find their way into international commerce, diminishing the value of the dollar and defrauding merchants worldwide. The temptation and reward are simply too great, and in many cases nation-state actors are engaging in a form of economic warfare. This is a technological arms race on a vast scale, between criminal syndicates, rogue countries, and nations protecting their capital controls.

You can't counterfeit Bitcoin. It is a fact that people rarely speak of, but it is a foundational property of the system, and absolutely essential to its continuing success. The same strong cryptography that allows for effortless transfer of value gives Bitcoin an enviable level of digital protection: a virtual asset that cannot be duplicated, altered, copied, or printed.

Even better, the same properties can be extended to protect all sorts of digital assets printed into the blockchain: a verifiable Fort Knox of digital goods, and tokens that represent everything from cars to property, derivatives, and stocks. The ability to publish these items onto the distributed ledger, and the software and systems to manage them, is critically important to the next wave of contract and digital currency innovation.

The idea of private currency has been advocated for years, most prominently by economist Friedrich Hayek, who won the Nobel prize in 1974 for his work on the theory of money. His work on the subject implies that local communities can drive spending and reach shared goals by minting locally used currency and using it exclusively for trade as an adjusted unit of barter, and that competing fiat instruments would of their own accord reach an equilibrium in the free market, a natural alternative to state control.

These same concepts map nicely onto decentralized communities. The forums and private groups that characterize such large portions of the net are perfect places to adopt this new variation of the movement, with the underlying technology freely accessible and decentralized. Around Bitcoin we are already starting to see this type of fragmentation, with well over a thousand alternative currencies blossoming out of nothing. Of course, the vast majority of those currencies have little utility. They neither provide a secure network, nor are backed by any company or means of production. They exist for the most part as purely speculative instruments, designed to take the money of the unsuspecting and greedy who would invest. Penny stocks on steroids.

Now the game has changed. With little work a group can generate incorruptible, impossible-to-counterfeit tokens on top of the Bitcoin network and use them for trade, in exchange for perks, or simply as a matter of reputation. These can be instantly disseminated to participants, and they provide a powerful additional mechanism for user engagement, as well as unlocking easy trustless barter in distributed communities.

We are just scratching the surface of the types of applications possible. Real smart contracts and distributed verifiable tokens are the operational and technical bedrock of a myriad of new financial services and products. If Bitcoin is to financial interoperation what TCP/IP was to the Internet, we can expect a truly stunning wave of innovation ahead.

Ripple recently announced its vision for a smart contract infrastructure called Codius. You can check out their white paper here, although it is long on verbiage and short on details. The idea is simple though, and much more in line with a reasonably attainable vision of a smart contract infrastructure, one that leverages our existing investments in systems, tools, and architecture.

This is in sharp contrast to Ethereum, a design I find puzzling in that it tries to solve decentralized computation inside the blockchain model. I applaud their ambition, but naivete and optimism only go so far; execution is what matters here. Opcodes exist in Bitcoin, but many have been disabled due to security concerns and complexity, and the language doesn't begin to strive for Ethereum's Turing-complete operation. I could write an entire post on Ethereum, and frankly I've been putting it off while we wait for an initial prototype, but given the amount of money that has been raised it might be time for a more critical assessment.

The idea of leveraging the blockchain for trusted communication while hanging sandboxed autonomous agents off of it is much cleaner. I've spoken of using proof of existence in distributed voting as a mechanism for hashing source code into the blockchain so that all execution can be independently verified. Combining this approach with a sandboxed execution environment and a strict set of data acquisition APIs would provide a powerful framework for smart contract execution and validation.
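
As a sketch of the proof-of-existence half of that idea: anchoring a hash via an OP_RETURN output is one common way to timestamp data in the Bitcoin blockchain. The file name below is hypothetical, and this is only an illustration of the anchoring step, not a full verification framework.

    import hashlib

    def contract_fingerprint(source_code: bytes) -> bytes:
        """Hash the contract's source so anyone can later verify they are running
        exactly the code that was anchored in the blockchain."""
        return hashlib.sha256(source_code).digest()

    source = open("contract_module.py", "rb").read()  # hypothetical contract file
    digest = contract_fingerprint(source)

    # An OP_RETURN output (opcode 0x6a) can embed a small amount of arbitrary data
    # in a transaction; the exact relay limit has changed over time. Publishing the
    # digest this way timestamps the code without revealing it.
    op_return_script = b"\x6a" + bytes([len(digest)]) + digest
    print("Anchor this output script in a transaction:", op_return_script.hex())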

We already use distributed code to run Bitcoin, codified by a set of rules that all clients agree on to form a consensus network. If we follow this to its logical conclusion, why can't ad-hoc nodes run modules and have shared multi-signature access to resources on the network? Wouldn't that provide a framework on which to build a robust smart contract infrastructure?

I think Codius is on the right conceptual track, and I'm definitely looking forward to seeing how the community evolves. Autonomous agents with access to the blockchain will usher in a new wave of functionality. Think about the implications of being able to assign an agent to your wallet that can act on your behalf, hosted externally in a trusted execution environment and driven by the same consensus mechanism as the network itself. Further, by creating a language-agnostic implementation within that constrained environment, we can leverage the vast expertise in regular software development and not worry about the complexities of long-running DACs. I can see a model where companies pay for execution and maintenance of hosted smart contracts; certainly the overhead of handling compliance internally would make outsourcing an attractive option.

The only question in my mind is why it has taken so long; it's time to usher in an entirely new wave of contract innovation. It seems we are tantalizingly close to having not only decentralized autonomous software, but externally hosted agents, oracles, and contract infrastructure.

I'm from New York. I was born and raised in Manhattan, and spent over thirty years there. I want New York to succeed, and I want to see it expand available opportunities for its millions of residents. Nearly half of the people in the greater NYC area live in or near poverty. We have some of the greatest income inequality in the world, and a vast, burdened system of entitlements barely keeping the peace.

You see, New York has a problem. Government. Sprawling Byzantine regulation is choking the city, as stifling as any pollution, gumming up industry and robbing the city of innovation. If you manage to navigate this artificial maze, you are further squashed by crippling taxes.

Like the legendary Sisyphus, tasked with rolling a boulder up a hill that can never be summited, new business is forced to wage an uphill battle. In a state that desperately needs more jobs, entrepreneurial activity brings wealth. We can’t have that.

The "BitLicense" is a proposed law strictly regulating virtual currency companies, and it is almost guaranteed to rob New York of billions of dollars in aggregate development. Spearheaded by bureaucrats, dreamt up by committee, and stinking of protectionism, its fetid, rotten core will deny New Yorkers the economic development they desperately need, while simultaneously reducing taxable revenue for the state and city.

Requirements such as having a full time CISO (chief information security officer) will simply increase the cost of doing business, and will make the city an unattractive place for fledgling technology companies. Similarly, bonding requirements for held virtual currency and identity requirements stricter than federal requisites will further stifle innovation. When did the Empire State become the nanny state?

This is doubly a shame given that NY has so many unique attributes, and with even the tiniest degree of foresight could make significant inroads in digital currency: a solid, educated workforce in the payments and financial industry; direct access to the world's most liquid capital markets; close proximity to leading banks and financiers; a small but growing venture capital scene; and a smart tech community in the same vein as the late-nineties Silicon Alley.

New York is often ranked with London as a, if not the, financial capital of the world. That fact has made it complacent, lazy, and greedy. Aggressive new economies like Singapore and Hong Kong are poised to take that crown if we unduly burden ourselves with unnecessary and poorly thought out regulation. I'm not opposed to regulation per se, but I think the knee-jerk reaction to these new technologies will only cement NY as an also-ran in what promises to be a disruptive race. What if Netscape had been outlawed in the nineties? How much wealth has the Internet brought the United States?

I have an idea: how about we distill this down to a single line, "New York welcomes all digital currency businesses, and will provide tax incentives to get them to start here and help lift our citizenry from crippling and unnecessary poverty." Oh, who am I kidding? Let's ban soda while we're at it.


It’s a bird, it’s a plane, no, it’s SenderDefender for Gmail! Hi Everyone, I’m proud to announce the SenderDefender for Gmail beta! Time to unlock file transfer super powers. :) It has been a long time coming, but I’ve rewritten a substantial amount of the underlying technology and distilled it into a Chrome plugin. That means very large file transfer, ephemeral messaging with delete-on-receipt, and end-to-end user encryption. One-click and drag-and-drop support for up to 100 Gigabytes PER FILE, encryption designed to protect your data from a host of threats, and automatic expiry of sensitive information to prevent hacking and fraud in the event your email account is compromised.

Other features include drag-and-drop directory zipping and transfer, which not only reduces bandwidth and transfer time but makes it super easy to send folders directly from Gmail. The best part is that you can send files anywhere; recipients do not need Gmail, and can still receive documents quickly and securely in their web browsers. Structurally that is a major difference between SenderDefender and many of the file transfer options available: no sign-ups, no logins, no advertising, just secure hassle-free data delivery.

The application also supports a variable and adjustable threading system so that you can optimize transfer time depending on the amount of bandwidth, memory, and processing resources you have. You can modify those settings by clicking the SenderDefender drop down button. Working resources also automatically scale down to reduce power consumption and extend battery life on laptops.

You can find the secret Chrome Web Store beta link HERE, which will allow you to install directly. Please feel free to spread it around, or send it to friends, family, and people you think will use it. Every subscription tier includes a generous free trial period, so please try it out and let me know what you think; any suggestions or bug reports should be sent to support@coinlock.com. Just click the SD icon in the compose window to get started, or drag a larger file or directory right into the compose frame. It really is that easy: compression, encryption, and transfer wrapped into an extremely tight package. People without the plugin get an easy to follow link, and plugin users get a really seamless download experience. You can continue using Gmail or other websites while your files download in the background. If you love it, write me an awesome review :)

Finally, I recently wrote about Amazon's Zocalo and file transfer services in general. SenderDefender significantly raises the bar on how that happens and protects user data from certain classes of attacks that most commercial services don't even address. Many claim to support encryption in their products, but it comes down to whether they ultimately have access to your data. Nobody at SenderDefender can read your data: not the service provider, not the network carrier, no one. This isn't true for Zocalo, Dropbox, Box, Citrix ShareFile, and a host of other competing services. If your data matters or you have big transfer needs, use SenderDefender and help support the development of software and systems that preserve your data privacy.
I've got a host of really cool features planned in future versions; if you have any ideas just drop me an email and let's talk about it. Follow me on this blog or @brantonbits on Twitter to keep up with development or let me know what you think!

Amazon has announced its new storage service, Zocalo, aimed right at the enterprise market. This is notable because Amazon has served as the backbone of the file storage industry for the last five years. Simple APIs coupled with massive storage capabilities and extremely competitive pricing resulted in Amazon becoming a central pillar in the design of many cloud based and hybrid storage systems. This is a subject close to my heart because I've spent significant amounts of time mucking around in their ecosystem and have my own storage product built on top of it called SenderDefender. When I first saw the announcement my first thought was, "great, more competition," but in reality what we have is more of the same.

Companies that leverage systems like Dropbox, Box, or now Zocalo are making a mistake. They are ceding complete control of their data to a third party, and in exchange for inexpensive and accessible storage are opening themselves up to a host of insider threats. A few months ago I gave a talk on insider threats in a military setting, and there is a common misconception that the term only applies to the employees within an organization. The reality is that it applies to the employees within every organization that you interface with. When you leverage services like Amazon’s Zocalo offering you are dramatically increasing the attack surface of your company.

The attack surface is a nebulous security phrase, but it boils down to the idea that the more systems, components, and people who have access to your internal data, the less secure you ultimately are. Now, in addition to worrying about the disgruntled former employee, you need to worry about Amazon's disgruntled employees, and Dropbox's, etc. There is an amplification effect at play that results in a greater chance of corporate espionage and data theft. These are very real issues that target American corporations and individuals in the form of identity theft and fraud every single day. People are all too willing to exchange security for convenience and low cost. The weakest link is always the person with the most access, and the most to gain from a security breach.

Now, as a small business owner or individual you may think this doesn't apply to you, that you aren't a target, that nobody is interested. The reality is quite different. Your data is commingled with that of thousands of other companies on shared servers. Someone could be targeting SSNs or credit card numbers, or maybe they are just grabbing all Word documents, or files that contain actionable intelligence. They can do mass collection and sweeping where your data is just collateral damage. Numerous companies have already been compromised because of the lax data security policies of most cloud service providers. I haven't even started talking about Google Drive and Microsoft OneDrive, both of which have similarly atrocious security policies. They have a vested interest in being able to read your data; the advertising business model depends on it.

The thing is, there is a real solution to this problem. Cryptography. Strong user-controlled cryptography that denies third party organizations any access to the underlying data. SenderDefender is built on that principle: nothing short of mathematical security can keep people out of your private data. As we consolidate onto cloud services this is becoming increasingly important. Why would you trust Amazon or Dropbox or Google with your private legal documents, personal information, or medical history? Yet people do, every single day, because those companies have masterfully blurred the lines. There is a perception that your data is private, when in fact it is completely open, accessible, searchable, and indexed. We need to support next generation services that provide convenience and ease of use, while also taking a real stance on these issues.
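
The principle is simple to demonstrate. Here is a hedged sketch of client-side encryption before upload using the third-party `cryptography` package; this is not SenderDefender's actual implementation, just an illustration of the idea that the key never leaves the user.

    # pip install cryptography
    from cryptography.fernet import Fernet

    # The key is generated and kept on the user's machine; the storage provider
    # only ever sees ciphertext.
    key = Fernet.generate_key()
    box = Fernet(key)

    plaintext = b"contents of a sensitive document..."
    ciphertext = box.encrypt(plaintext)

    # Upload `ciphertext` to S3 / Zocalo / Dropbox / anywhere. Without `key`,
    # none of those providers (or their insiders) can read it.
    assert box.decrypt(ciphertext) == plaintext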

I've said it before, but it's time for a corporate version of Megaupload: a completely opaque, well integrated, and seamless encryption experience for everyday users that keeps their data out of the prying eyes of government, malicious hackers, and profiteers. We are tantalizingly close to being able to realize that experience, but consumers need to be educated about the risks involved when they use online storage services, and the potential monetary benefits of keeping full control of their information.

7 transactions per second. That is the current limit of the Bitcoin protocol and network. As a point of comparison, the Mastercard network processes roughly 1200 transactions per second at peak, at 120 milliseconds a transaction, and that pales in comparison to a high frequency trading system which might process tens of thousands of transactions per second with turn-around times in the single-digit microseconds. Clearly there is a technology gap between the best centralized processing systems and our decentralized wunderkind Bitcoin. To be fair, the playing field isn't level. A centralized system has a massive advantage in terms of inter-node connectivity, and is able to leverage advanced high speed interconnects, directly accessible remote memory, and extremely fast host IPC (inter-process communication). Bitcoin must deal with low bandwidth links, untrustworthy actors, intermittent connectivity, and a host of other difficult scalability challenges.
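
Where that 7-per-second figure comes from is simple arithmetic; the ~250-byte average transaction size below is an approximation.

    BLOCK_SIZE_BYTES = 1_000_000   # 1 MB block size limit
    AVG_TX_BYTES = 250             # rough average transaction size (approximation)
    BLOCK_INTERVAL_S = 600         # one block every ~10 minutes

    txs_per_block = BLOCK_SIZE_BYTES // AVG_TX_BYTES
    tps = txs_per_block / BLOCK_INTERVAL_S
    print(f"{txs_per_block} transactions per block, ~{tps:.1f} per second")
    # -> 4000 transactions per block, ~6.7 per second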

Bitcoin's limitations are severalfold. First, the block size, currently 1 megabyte, is tiny compared to the capabilities of modern networks. There are only so many transactions that can fit into a single 1 megabyte chunk of data, which puts a hard cap on the number we can process in a ten minute window. As the popularity of the protocol grows and second generation systems are built on top of it, we will begin to feel the pain. There are two relatively simple ways to address this. First, let's fit more transactions into existing blocks. I've done some analysis of the blockchain's 45-million-plus transactions at the time of this writing. The vast majority of them are standard P2PKH scripts, which have a fixed format, written here in Erlang bit syntax.

<<?OP_DUP:8, ?OP_HASH160:8, 16#14:8, Pubkey:160/bitstring, ?OP_EQUALVERIFY:8, ?OP_CHECKSIG:8>>

It seems to me that for standard transaction types we could just have a placeholder that knocks off 4 bytes per transaction. The pubkey can actually be uncompressed or compressed, which means it's possible to insert 512-bit pubkeys into standard transactions. Let's get rid of that too, and stick with only compressed keys. Finally, a single public key can be used multiple times across the blockchain, or even within a single block; perhaps we need lookup-table functionality or opcodes that let us reference elements of other transactions.
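
A back-of-the-envelope sketch of the savings: the byte counts follow the standard P2PKH layout, while the one-byte template ID is my own hypothetical encoding, not an existing proposal.

    # Standard P2PKH scriptPubKey layout, in bytes:
    #   OP_DUP OP_HASH160 <push 20> <20-byte pubkey hash> OP_EQUALVERIFY OP_CHECKSIG
    P2PKH_SCRIPT_BYTES = 1 + 1 + 1 + 20 + 1 + 1      # = 25

    # Hypothetical compact form: 1-byte template ID + the 20-byte hash.
    TEMPLATE_SCRIPT_BYTES = 1 + 20                    # = 21

    saved_per_output = P2PKH_SCRIPT_BYTES - TEMPLATE_SCRIPT_BYTES
    print(f"{saved_per_output} bytes saved per standard output")  # 4 bytes

    # Input side: spends that use compressed (33-byte) instead of uncompressed
    # (65-byte) public keys save another 32 bytes each.
    print(f"{65 - 33} additional bytes saved per input by requiring compressed keys")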

Taking it one step further, maybe it is possible to adapt an existing protocol like FIX/FAST, rebuilt specifically for blockchain use. A presence map per transaction might allow us to codify entire sets of transactions without explicitly including the scripts. That protocol had similar design constraints, in that minimal amounts of bandwidth are consumed to decrease transmission latency at the expense of encoding and decoding complexity. The goal of this complexity is simply to cram more transactions into a single block without necessarily increasing the block size.
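
A very rough sketch of what a FAST-style presence map could look like for block transactions. This is my own toy encoding to illustrate the idea, not an actual protocol: a leading bitmask says which fields deviate from a template, and everything else is implied.

    # Toy "presence map" encoding: only fields that differ from the template
    # defaults are transmitted, mirroring the FIX/FAST idea of trading decode
    # complexity for bandwidth. Field names and defaults are illustrative.
    FIELDS = ["version", "locktime", "script_type"]
    DEFAULTS = {"version": 1, "locktime": 0, "script_type": "p2pkh"}

    def encode(tx):
        presence = 0
        explicit = []
        for i, field in enumerate(FIELDS):
            if tx.get(field, DEFAULTS[field]) != DEFAULTS[field]:
                presence |= 1 << i
                explicit.append(tx[field])
        return presence, explicit  # only non-default fields are transmitted

    print(encode({"version": 1, "locktime": 0, "script_type": "p2pkh"}))  # (0, [])
    print(encode({"version": 2, "locktime": 0, "script_type": "p2sh"}))   # (5, [2, 'p2sh'])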

The second problem is that miners are incentivized to push smaller blocks. The block reward far outweighs the transaction fees, and there is an advantage to publishing a block that gets accepted by the majority of the network faster, because the propagation delay of larger blocks could potentially result in an orphan. Orphaned blocks are worth approximately nothing, and anything that can be done to reduce the orphan rate will positively affect mining revenue. This problem only becomes more pernicious as the block size grows and the transaction count increases. I think we need to change the conditions of the race such that, within a time threshold, an alternate winner can steal the mining reward if it includes more, higher-value transactions in its block. Granted this might cause block-stuffing in order to ensure maximum rewards, but it would even the playing field in some respects. It should make sense to both reap more transaction fees and have a better chance at the block reward.
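
To see why a rational miner trims transactions, here is a crude expected-value sketch. The per-kilobyte propagation penalty and the fee level are invented assumptions, and the orphan probability uses the same rough Poisson-style simplification as before.

    import math

    BLOCK_REWARD_BTC = 25.0        # block subsidy at the time of writing
    FEE_PER_TX_BTC = 0.0001        # assumed average fee per transaction
    PROPAGATION_S_PER_KB = 0.08    # assumed extra relay delay per kilobyte of block
    BLOCK_INTERVAL_S = 600

    def expected_revenue(num_txs, avg_tx_kb=0.25):
        fees = num_txs * FEE_PER_TX_BTC
        propagation = num_txs * avg_tx_kb * PROPAGATION_S_PER_KB
        p_orphan = 1 - math.exp(-propagation / BLOCK_INTERVAL_S)
        return (BLOCK_REWARD_BTC + fees) * (1 - p_orphan)

    for n in (0, 1000, 4000):
        print(f"{n:5d} txs -> expected revenue {expected_revenue(n):.4f} BTC")
    # Under these assumptions the extra fees never make up for the orphan risk,
    # so the rational move is to publish a small block.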

We can also increase the block size substantially so that more transactions can be shoved into each block, assuming we can disincentivize pushing smaller blocks. This might have the effect of increasing centralization, as slower, less well-connected nodes get pushed off the network or become unable to keep up with the transaction volume.