An anonymous reader shares a report: In 2018, several high-profile controversies involving AI served as a wake-up call for technologists, policymakers, and the public. The technology may have brought us welcome advances in many fields, but it can also fail catastrophically when built shoddily or applied carelessly. It's hardly a surprise, then, that Americans have mixed support for the continued development of AI and overwhelmingly agree that it should be regulated, according to a new study from the Center for the Governance of AI and Oxford University's Future of Humanity Institute. These are important lessons for policymakers and technologists to consider in the discussion on how best to advance and regulate AI, says Allan Dafoe, director of the center and coauthor of the report. "There isn't currently a consensus in favor of developing advanced AI, or that it's going to be good for humanity," he says. "That kind of perception could lead to the development of AI being perceived as illegitimate or cause political backlashes against the development of AI."
The billions of bacteria that call your gut home may help regulate everything from your ability to digest food to how your immune system functions. But scientists know very little about how that system, known as the microbiome, changes over time—or even what a “normal” one looks like. Now, researchers studying the gut bacteria of thousands of people around the globe have come to one conclusion: The microbiome is a surprisingly accurate biological clock, able to predict the age of most people to within a few years.
To discover how the microbiome changes over time, longevity researcher Alex Zhavoronkov and colleagues at InSilico Medicine, a Rockville, Maryland–based artificial intelligence startup, examined more than 3600 samples of gut bacteria from 1165 healthy individuals living across the globe. Of the samples, about a third were from people aged 20 to 39, another third were from people aged 40 to 59, and the final third were from people aged 60 to 90.
The scientists then used machine learning to analyze the data. First, they trained their computer program—a deep learning algorithm loosely modeled on how neurons work in the brain—on 95 different species of bacteria from 90% of the samples, along with the ages of the people they had come from. Then, they asked the algorithm to predict the ages of the people who provided the remaining 10%. Their program was able to accurately predict someone’s age within 4 years, they report on the preprint server bioRxiv. Out of the 95 species of bacteria, 39 were found to be most important in predicting age.
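The study's actual model was a deep neural network trained on the abundances of 95 bacterial species; the sketch below only illustrates the same procedure—train on 90% of samples, predict ages for the held-out 10%—using scikit-learn's small multilayer perceptron on made-up data (all numbers here are synthetic, chosen purely for illustration):

```python
import numpy as np
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for the real dataset: 1000 people, 95 bacterial
# species, with ten species whose abundance drifts with age.
n_people, n_species = 1000, 95
ages = rng.uniform(20, 90, n_people)
abundances = rng.random((n_people, n_species))
abundances[:, :10] += 0.02 * ages[:, None]  # age-correlated species

# Train on 90% of the samples, predict ages for the held-out 10%.
X_train, X_test, y_train, y_test = train_test_split(
    abundances, ages, test_size=0.1, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000,
                     random_state=0).fit(X_train, y_train)
mae = mean_absolute_error(y_test, model.predict(X_test))
print(f"mean absolute error: {mae:.1f} years")
```

The held-out error is the figure of merit here, mirroring the paper's "within 4 years" result; with real data, the informative species would be discovered by the model rather than planted in advance.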
Zhavoronkov and his colleagues found that some microbes became more abundant as people aged, like Eubacterium hallii, which is thought to be important to metabolism in the intestines. Others decreased, like Bacteroides vulgatus, which has been linked to ulcerative colitis, a type of inflammation in the digestive tract. Changes in diet, sleep habits, and physical activity likely contribute to these shifts in bacterial species, says co-author Vadim Gladyshev, a Harvard University biologist who studies aging.
Zhavoronkov says this “microbiome aging clock” could be used as a baseline to test how fast or slow a person’s gut is aging and whether things like alcohol, antibiotics, probiotics, or diet have any effect on longevity. It could also be used to compare healthy people with those who have certain diseases, like Alzheimer’s, to see whether their microbiomes deviate from the norm.
If the idea is validated, it would join other biomarkers scientists use to predict biological age, including the length of telomeres—the tips of chromosomes implicated in aging—and changes to DNA expression over a person’s lifetime. Combining the new aging clock with these others could yield a much more accurate picture of a person’s true biological age—and health. It could also help researchers better test whether certain interventions—including drugs and other treatments—have any effect on the aging process. “You don’t need to wait until people die to conduct longevity experiments,” Zhavoronkov says.
The idea that you can predict someone’s age based on their gut microbiome is “very plausible” and of “tremendous interest” to scientists studying aging, says computer scientist and microbiome researcher Rob Knight, director of the Center for Microbiome Innovation at the University of California, San Diego. His group is analyzing 15,000 samples from the American Gut Project, a worldwide microbiome study he founded, to develop similar age predictors.
But one of the challenges of developing such a clock, he adds, is that there are huge differences in which bacteria are present in the guts of people around the world. “It’s extremely important to replicate these kinds of studies with markedly different populations” to find out whether there are distinct signs of aging in different groups of people, Knight says.
He says it’s also not known whether changes in the microbiome cause people to age more rapidly, or whether the changes are simply a side effect of aging. InSilico Medicine is building several aging clocks based on machine learning that could be combined with the microbiome one. “Age is such an important parameter in all kinds of diseases,” Zhavoronkov says. “Every second we change.”
Jason C. Brown is CEO and co-founder of Tally, the world’s first automated debt manager.
The unbundling of the bank has begun.
Just 10 years ago, the average consumer had very few financial relationships and interacted with just one or two institutions to fulfill all of their financial needs. But fintech companies are breaking up the old guard by focusing on specific things that banks have done and simply doing them better. As a result, the average consumer now has numerous financial relationships, each with a clear-cut purpose.
The fintech revolution started after the 2008 financial crisis, and was driven largely out of frustration with the existing establishment. Facing heavy scrutiny, banks pulled back dramatically on a lot of their activities to reduce risk, which left a significant gap in the marketplace. Fintech companies stepped in and brought new ideas to an industry that had seriously lacked innovation. But now that the economy has rebounded, banks are aggressively running straight into that gap to recapture what they lost.
The established banks are focused on copying the best of what fintech has to offer. They’re moving slowly and are a solid five years behind, but their goal is to provide a just-good-enough mobile experience to ensure their customers stay with them. Banks know they don’t need to be better than the fintech companies; their advantages of scale and distribution ensure they can maintain their substantial customer base with a sufficient product.
Those advantages prevent fintech companies from truly competing against banks. If a bank really wants to be in a certain business, it can dominate a fintech company every single day because it has lower cost of funds and can afford to pay more per customer. That makes me generally pessimistic about any fintech company whose only wedge is serving a market that banks don’t serve. Most of those companies will find themselves unable to grow beyond a certain level in the long-term because they will be copied by the establishment.
For a fintech company thinking about how to stay relevant, the only defensible long-term strategy is one driven by automation.
The next 20 years are going to be defined by the way automation transforms the average person’s life. An intelligent service will make, and then execute, most of an individual’s financial decisions in the not-so-distant future. That service will collaborate with the person to understand their human objectives — when they want to retire or where they can afford to send their children to college — and use its super intelligence and its ability to execute things in microseconds over and over to put the entire financial system to work for the person. The individual may not understand how or why the intelligent service is doing all of these things, but he or she knows the actions are completely in the service of improving his or her life.
Imagine a scenario where a person ports their entire financial profile wherever they want it. With the push of a button, all of their accounts are transferred from one place to another, much like porting a phone number.
The cellphone industry, for example, fought very hard to prevent the porting of numbers because not allowing it created stickiness. That stickiness reduced people’s willingness to switch carriers, which allowed the carriers to charge higher prices. In 2003, when the government forced the industry to allow the porting of phone numbers, cellphone plan prices went down. Excess profits evaporated when this friction was eliminated.
Automation is the ultimate reduction in friction because it allows optimizations to happen perpetually. Automation allows optimizations to happen at zero marginal cost. Automation allows optimizations to happen without human involvement, and when you’re able to do that, the customer is always matched with the ideal financial situation.
This is a nightmare scenario for banks: Once automation reduces enough friction in the financial industry, banks lose their relationships with customers. They become a utility, a provider of the pipes and wires that allow money to be stored and moved from place to place. Then, specialized fintech companies swoop in and use their data expertise to make decisions for people and execute on those decisions. The end result is an invisible, intelligent service that figures out everything for the customer and does it for them.
In this sense, the power of automation goes beyond an intelligent service’s ability to decide what’s best and take action on behalf of a customer. Automation’s ability to reduce friction allows for a more competitive market, and those actions can create additional wealth for the customer by matching them with the best available product in the marketplace.
Figuring out how to weave intelligent automation into a product experience, a manufacturing process or a product development process is crucial to growth and success for fintech companies. Those that fail to recognize the changing technological landscape run the risk of losing their market share and their position in the marketplace.
Using nothing but your computer, some software, and a $20 radio dongle, you can receive transmissions from NOAA weather satellites in the sky overhead. This is an incredibly exciting project that’s easy to do but produces great images. Think about it — you can receive images from a satellite almost 1,000 km straight above you!
NOAA operates a series of weather satellites, designated NOAA 15, NOAA 18, and NOAA 19, that circle the Earth in polar orbits. These satellites are particularly interesting because they’re constantly transmitting an easy-to-decode image signal of what they’re currently looking at. The satellites’ orbits are configured such that one of the three passes within range of almost every point on the globe every three hours.
You can track each of the three here. The closer the yellow line is to your location when the satellite is overhead the higher the satellite will appear in the sky. Keep in mind that the satellites’ trajectories across the globe (the yellow lines) are constantly progressing sideways by a few hundred miles with each successive orbit. You don’t really need to worry about tracking them — I’ll soon introduce a program that calculates the flyovers for you.
These satellites transmit images at 137 MHz, which is relatively close to both FM radio transmissions (~90 MHz to 110 MHz) and 2-meter amateur radio transmissions (~144 MHz). The data itself is encoded using frequency modulation (FM), the same method that FM stations use to transmit audio. Much of the encoded information is in the range of human hearing, which is why you can actually convert the signals to audio and “hear” them!
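For the curious, the FM demodulation that the software performs for you can be sketched in a few lines. Below is a minimal polar-discriminator demo on a synthetic signal; the sample rate, deviation, and 1 kHz test tone are arbitrary choices for illustration, not the satellites' actual parameters:

```python
import numpy as np

# FM demodulation via a polar discriminator: the phase difference
# between consecutive complex (IQ) samples is proportional to the
# instantaneous frequency, i.e. the modulating audio.
fs = 48_000                           # sample rate, Hz
t = np.arange(fs) / fs                # one second of samples
tone = np.sin(2 * np.pi * 1000 * t)   # a 1 kHz "audio" tone

# Frequency-modulate the tone onto a complex baseband carrier.
deviation = 5_000                     # peak frequency deviation, Hz
phase = 2 * np.pi * deviation * np.cumsum(tone) / fs
iq = np.exp(1j * phase)

# Demodulate: the angle of each sample times the conjugate of the
# previous sample recovers the per-sample phase step.
demod = np.angle(iq[1:] * np.conj(iq[:-1]))
demod *= fs / (2 * np.pi * deviation)  # rescale to original amplitude

# The recovered signal closely matches the original tone.
err = np.max(np.abs(demod - tone[1:]))
print(f"max reconstruction error: {err:.2e}")
```

Real receiver software adds filtering, resampling, and gain control on top of this core step, but the principle is the same: the audio you "hear" is the instantaneous frequency of the 137 MHz carrier.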
The main things needed for listening to any radio signal are a receiver and an antenna. This awesome kit contains both! You can listen to anything from 500 kHz to 1.7 GHz with this hardware, covering interesting bands such as AM/FM radio, police radio frequencies, amateur radio frequencies, and so much more. Additionally, the included antenna setup is perfect for receiving data from weather satellites. Best of all, it’s under $30!
The kit contains a very common device branded RTL-SDR. Many programs support RTL-SDR devices, so you’ll find that programs other than the ones I link to will also work great.
I am writing many of the instructions below for computers running macOS, but I will also include information for Windows users. Basically any modern computer has the processing capability for this task.
The basic workflow for capturing NOAA satellite data is as follows:
Use a tuning program to configure the RTL-SDR hardware to receive audio at 137 MHz
Pipe this audio into an APT decoding program
Process the audio and post-process the image
The first thing to install is a piece of software that enables you to pipe audio between programs. Soundflower on macOS works well for this, as does VB-Audio on Windows. Install one or the other depending on your platform. If all goes well, the new virtual audio device should appear in the list of input and output audio devices on your computer.
CubicSDR, for both macOS and Windows, is a wonderful program that makes using your RTL-SDR device very easy. Download it from here.
In order to verify that everything works you should first do some preliminary setup of the antenna. Screw the two longer telescoping antennas into the v-mount included with the kit. Attach one end of the long cable included in the kit to the output of the v-mount, then screw the other end into the exposed connector on the RTL-SDR device. Expand the antennas, and you’re good to go!
Plug the RTL-SDR into your computer, start CubicSDR, and the software should present you with a screen such as the one below:
Highlight the device called “Generic RTL….” and click start. You will then be presented with a waterfall view of the electromagnetic spectrum around you. Blue is the lowest signal level, ranging all the way up to red. The program defaults to centering on 100 MHz. You’ll likely find an FM station transmitting inside the default window; click on the center of one of the green bands and the station should start playing through the speakers on your computer (see the above image if you’re confused).
Next you need to install the program responsible for decoding the APT signals. By far the most common and most feature-rich program to do so is called WXtoImg. Unfortunately the program is no longer being developed, but you can download the last release of it here.
This is a great program that can both tell you when satellites will fly overhead and decode the audio coming into the program. Install it, and input your latitude and longitude information when prompted at setup. The first thing you’ll want to check out is the satellite flyover list. Click on File->Update Keplers, then select File->Satellite Pass List. A window similar to the one above should appear with the times of the satellite passes for the next week. Check under “Local Time” to see when the next one will occur!
An important thing to note here is the duration and altitude listed. With the antenna we’re going to use later on, you want the satellite to be as close to directly overhead as possible. This will result in the least amount of noise in your image. For your first go it’s best to pick a pass where the satellite will be in the sky for at least 11:30 (eleven and a half minutes) — there’s a positive relationship between this duration and the maximum angle that the satellite makes with the horizon as it passes. The longer the duration, the higher the maximum angle, and the better your image. Below is an image I received on a pass with a 9-minute duration, which was clearly too low in the sky to be useful given my setup. Note that the map overlay is added in post.
If you see something like this at first don’t be discouraged! Those vertical bars in the middle and sides indicate that you are receiving transmissions from a satellite.
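Incidentally, the duration-to-elevation relationship mentioned above falls out of simple geometry. The rough model below assumes a circular orbit at about 850 km (approximately right for these satellites) and a great-circle ground track; it is a back-of-the-envelope sketch, not what the tracking software actually computes:

```python
import math

# Rough pass-duration estimate for a circular polar orbit, assuming
# the ground track is a great circle passing at a fixed minimum
# angular distance from the observer.
RE = 6371.0        # Earth radius, km
H = 850.0          # approximate satellite altitude, km
MU = 398600.4      # Earth's gravitational parameter, km^3/s^2

def pass_duration_minutes(max_elevation_deg):
    """Minutes the satellite spends above the horizon for a pass
    whose highest point is max_elevation_deg above the horizon."""
    eps = math.radians(max_elevation_deg)
    rho = RE / (RE + H)
    # Angular radius of the visibility circle (elevation = 0).
    lam0 = math.acos(rho)
    # Angular distance at closest approach for this max elevation.
    lam_min = math.acos(rho * math.cos(eps)) - eps
    # Half the arc crossed inside the visibility circle.
    half_arc = math.acos(math.cos(lam0) / math.cos(lam_min))
    period = 2 * math.pi * math.sqrt((RE + H) ** 3 / MU)  # seconds
    omega = 2 * math.pi / period
    return 2 * half_arc / omega / 60

for e in (10, 30, 60, 90):
    print(f"max elevation {e:2d} deg -> ~{pass_duration_minutes(e):.1f} min")
```

Running this shows a directly overhead pass lasting roughly 16 minutes, with duration falling off steadily as the maximum elevation drops — which is exactly why a long pass in the list is the one worth waiting for.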
Now that you have your satellite pass information you can start working on your setup. The first thing you should try is linking all these programs together — I’ll explain how this works. CubicSDR will take the raw information coming in from your RTL-SDR device, perform whatever demodulation you tell it to (in this case FM), and output it to whatever audio output you give it. WXtoImg then takes this audio stream and converts it in real time to an image.
First off, start CubicSDR with your RTL-SDR device attached. Pick any FM radio station, and make sure the audio coming out of your speakers sounds at least somewhat nice. You can change frequencies by either clicking on the waterfall plot or changing things numerically in the window at the top-right. Now, click the audio-out dropdown in the top left corner and choose the virtual audio device that you previously set up. In my case, this was “Soundflower (2ch).”
You can then leave this in the background. Start WXtoImg, and navigate to Options->Recording Options. Under “Common Recording Options” click on the “soundcard” dropdown and select the same virtual audio device. Click OK to confirm, then select File->Record.
When it comes time for the real satellite flyover you’ll be hitting Auto Record. This will start the recording and decoding process when the satellite first appears on the horizon and stop it when the satellite disappears (it figures this out the same way it figures out when the flyover will occur — it has nothing to do with the signal you’re feeding it). For now, try hitting Manual Test. Line by line, you should now see a bunch of static filling the screen. This is WXtoImg attempting to convert the audio of the radio station to a weather satellite image. Naturally, this is just garbage!
If lines of black pixels fill the screen, you likely have your audio pipe configured incorrectly. The static indicates that there’s audio there for it to try and decode.
Now it’s time to set up your antenna! Extend your antennas so that they are exactly 53 cm long, and spread them 120 degrees apart. Place this outside on the ground such that the antennas are parallel with the ground, and point the center of the V northward.
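If you're wondering where 53 cm comes from: each element of this V-dipole is roughly a quarter wavelength at the ~137 MHz downlink, and builders typically trim a few percent off the free-space figure, which is why the recommendation is a little under the theoretical length. A quick check (using NOAA 19's 137.1 MHz downlink as the example frequency):

```python
# Quarter-wavelength check for a 137 MHz V-dipole element.
C = 299_792_458          # speed of light, m/s
f = 137.1e6              # NOAA 19 APT downlink, Hz

wavelength = C / f       # ~2.19 m
quarter = wavelength / 4 # ~0.55 m, close to the 53 cm recommendation
print(f"quarter wavelength: {quarter * 100:.1f} cm")
```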
You can get significantly improved pictures if you mount the antenna on something above the ground. I duct-taped mine to a 2x4 that raised it about 6 feet off the ground:
Check the pass list for the frequency of the next passing satellite, tune CubicSDR to that frequency, and set the bandwidth to 40 kHz. Open WXtoImg, select File->Record and then select Auto Record. You’re now ready to receive! When the satellite rises on the horizon you will steadily see the following signal emerge from the noise in CubicSDR:
It really is quite amazing to watch for the first time. Check back in WXtoImg to watch the image appear in real time. After recording is complete, the images should initially look like the picture below:
Inside of WXToImg you can add things such as map overlays and clean up the image. You can also play around with it to create false-color images, temperature maps, and more.
That’s all! If you have any questions post them below, and if you enjoyed this article you can view my website.
Investing in Blockchain companies is no different from investing in any other industry. To succeed, you need a thorough understanding of the space you plan to be in before you even think of allocating any capital toward a particular project or asset. It is no accident that “do your own research” has, over time, become a mantra for the Crypto niche of the Blockchain industry. Following it is what leads to rational investing and trading in any sphere. Below, we have compiled a list of what we believe are the must-read books for any aspiring Blockchain and Crypto investor.
Cryptoassets: The Innovative Investor’s Guide to Bitcoin and Beyond
by Chris Burniske and Jack Tatar
Of all of the books about Blockchain, this book often comes up as an excellent guide to the entirely new and rapidly emerging asset class of Cryptocurrencies, written by industry experts. It gives a broad overview of the state of digital assets and is highly recommended if you want to understand where the financial world is moving. If you’ve been in crypto for a while, it can serve as a great review of the history of Cryptocurrencies, while if you’re new, you will gain valuable insights into investing in this space. Along the way, Burniske and Tatar highlight how to spot quality Cryptoassets, avoid hype and fraudulent projects, and compose and balance your portfolio. Cryptoassets is a must-read for beginning crypto-enthusiasts and experienced capital market investors alike who want to learn a portfolio strategy backed by the opinions of industry experts.
Digital Gold: Bitcoin and the Inside Story of the Misfits and Millionaires Trying to Reinvent Money
by Nathaniel Popper
This book has been treated by many in the space as a Bitcoin bible of sorts. In any group of cryptophiles, it is not difficult to find a handful of investors and traders who got their start with this work, and there is a reason for that. Nathaniel Popper followed the movers and shakers in the early days of the Bitcoin space in order to compile the definitive history of it in an engaging fashion. Thanks to Popper’s accessible style of writing, it is fairly easy to understand the how and why of the rise of Bitcoin and the blockchain industry. This work is often considered one of the first blockchain investment books you should read before you dive deeper into a more technical analysis of the industry. Think of it as getting your feet wet before diving into the Crypto space.
The Internet of Money
by Andreas M. Antonopoulos
If you are not already familiar with Andreas Antonopoulos, then you should be. This particular book does an excellent job of answering the question of why Bitcoin has brought about a financial and technical revolution, with a potential that exceeds its original label of being a “digital currency.” In doing so, Andreas shows us how the rise of Bitcoin may be equated with the rise of the Internet. If one central theme can be taken from this work, it is that Blockchain is fundamentally changing our approach to solving problems of all kinds in all areas, due to its status as a decentralized technology.
Mastering Bitcoin: Unlocking Digital Currencies
by Andreas M. Antonopoulos
Andreas did not stop at one seminal work for this space. In order to fully understand Crypto’s beginnings and possible future, you need to read both this and The Internet of Money. The general utility of Mastering Bitcoin lies in the fact that programmers in the industry often consider it to be the best book on blockchain out there from a technical standpoint. More specifically, if you want to understand anything about how Cryptocurrency networks are programmed, then you should start here. As Bitcoin was the father of Cryptocurrencies, its code was effectively the father of the code of those who came after. Because of this, in a deep, technical sense, this may be the best book on Blockchain out there.
The Age of Cryptocurrency: How Bitcoin and Digital Money Are Challenging the Global Economic Order
by Paul Vigna and Michael Casey
The book dives into the questions of the ideological and technical roots of Cryptocurrencies from the point of view of journalists who have an extensive amount of experience in the financial industry. Vigna and Casey demystify the origins and functions of Cryptocurrencies, concluding with the central hypothesis that we will soon reach the age of a cyber-economy. Given this, one could also term this a guide to surviving in such a time. Along the way to reaching these conclusions, they take an unbiased view of the space, starting with Bitcoin, which may be particularly attractive to those of you who enjoy getting a balanced opinion on a new investment. Even though it is not necessarily the best book on Blockchain technology on a technical level, it is possibly the best for those looking for more of a history around the industry’s rise.
The Bitcoin Standard: The Decentralized Alternative to Central Banking
by Saifedean Ammous
The Bitcoin Standard may be considered to be at the top of the list of Blockchain investment books to read. Chiefly, this is because it deeply analyzes the historical context of the rise of Bitcoin, the economic properties that have allowed it to grow quickly, and its likely economic, political, and social implications. Ammous recaps the history of technologies performing the functions of money, starting from primitive shells and ending with modern government debt. With this background in place, he moves on to explain the operation of Bitcoin in a functional and intuitive way. The book also explores some of the most common questions: Is Bitcoin wasting energy? Is it for criminals? Who controls it? Can it be killed?
Reminiscences of a Stock Operator
by Edwin Lefèvre
While this is not a Blockchain book, this time-tested work, first published in 1923, represents a brilliant take on crowd psychology and market timing, which is why we believe it is a must-read for all investors, whether new or experienced. Overall, even though Cryptocurrencies and Blockchain represent an entirely new industry, the advice contained in Lefèvre’s work is timeless, simply because of how people think about investing.
The Intelligent Investor: The Definitive Book on Value Investing
by Benjamin Graham
While this is not typically considered to be one of the best Blockchain or Crypto books per se, Graham’s philosophy of ‘value investing’ is essential for any investor to know. In a nutshell, its utility lies in the fact that it helps investors guard against substantial errors while teaching them to develop sound long-term strategies. Graham’s vision has made this book the stock market bible ever since its original publication in 1949. This particular work is also widely renowned for the praise it received from Warren Buffett himself. Do a quick Google search for the best blockchain investing books and you will find it on almost every list compiled. While it may not be a typical Blockchain investment book, it is essential reading, especially for those coming into this new industry with little to no investment experience. Just as Digital Gold is a kind of Bitcoin bible, consider this a useful candidate for a general investing bible.
An Altcoin Trader’s Handbook
by Nik Patel
The book details five years of one cryptocurrency trader’s experience and his effort to put together what he claims is a comprehensive strategy for profitable altcoin speculation. The first part is centered around his personal life. The second hones in on practical trading strategies, including fundamental and technical analysis, with detailed charts and extra notes to illustrate key points. While this may not fall under the umbrella of books about blockchain, it will perfectly fit your needs if you are a trader interested in Altcoins with small market capitalizations. Even so, it is important to mention that with any work claiming to provide insider knowledge on investing, you should verify those claims with your own external research.
In the end, we hope that this list of books on blockchain investments starts you on your journey to being a confident Crypto investor. Remember that doing your own research does not stop here. This is only the beginning. Last but not least, if you find any Crypto or Blockchain book that you think deserves to be here, let us know.
Data visualizations for text: How to show the process of writing with the writing graph
Scroll to the bottom of this story for a link to the original post.
TEXT EDITORS (and the files they work on) reveal surprisingly little about the history of editing. If you’re lucky, you get revisions to browse, and if not, you get undo/redo buttons. By adding temporal metadata to files, apps can display more than just the product — they can show process. This post introduces the writing graph, a timeline for viewing editing activity. A proof of concept below shows how new media artists, reflective writers and even casual readers can use this text visualization to learn more about what they’re reading.
THE HISTORY OF MANIPULATING TEXT is rich with innovation (e.g., water-soluble cave paint, clay tablets, printing presses, copy & paste, 💩, etc.). But for the most part, the narrative converges on a digital standard of the late 20th century: Adding and removing characters via a caret in a sequence of lines we call, ever so passionately, the “document” (metaphorical baggage included).
A quick preface: In all the data vis courses I’ve taught, there have always been students eager to share “novel” visualizations. These are typically tweaks on canonical visualizations, or maybe even complex combinations thereof (like a scatterplot but where it’s in 4D and the scattering is of small multiples of non-geographical choropleths). Unsurprisingly, many researchers have spent a lot of time refining the art of visual communication; visualizations are like wood joints — it’s good to have an optimistic suspicion of anything “new”. Instead of proposing an alternative way to represent data, I’m only applying standard techniques to new content.
Having said that, I think there is a dearth of text visualizations. Sure, there’s the somewhat (in)famous “word cloud” that maps word count to font size, and apps like iA Writer that gracefully color words by part-of-speech, and a lot of great academic projects, but not so much more (compared to other domains, at least). One reason is the ubiquity of plain text in operating systems and inter-app communication (which markdown embraces). Another is, “if it ain’t broke, don’t fix it”.
But innovation is good, and I know that a growing community of new media artists and creative coders would love tools that felt less utilitarian (e.g., Word) and more exploratory (e.g., Max); as a generative writer once told me, “musicians are spoiled”. It’s good to remember that niche technologies originally designed for expert needs (e.g., hands-free voice recognition for jet fighters) often find their way to the rest of us (e.g., Siri for commuters).
WHO WOULD WANT to see this dimension of the writing process? As a former cognitive scientist, I can tell you psychologists happen to be pretty interested in processes; they love timing participants because it helps them infer mental processes, like whether people are thinking fast or slow. In nearly every study I’ve done, response time was a dependent measure (how long it takes you to make a moral judgment, solve a puzzle, foveate on a target, etc.) If you ever participate in an experiment, you can assume everything you do — including waiting for the “experiment” to start — is being timed (but if you tell your researcher this, they may have to toss your data due to bias). It should be said that scientists, like designers, know that a measure like time-to-completion tells you nothing more than time-to-completion; you might have paused on a word because you were conjuring synonyms, or you might have been distracted by a notification to somehow appease a social network. That’s why experiments often analyze groups of people over multiple trials — to “wash out the noise” of any given individual or situation.
The reason I was interested in seeing this context came after spending a little time with some poets at Brown. I was working on a new text editor at the time, and the designer in me was intrigued to learn more about writers’ process, to look inside their work. In the spirit of transgression, here are some crazy 8’s for motivation:
Giving readers x-ray vision into the streams of consciousness in a poem, which are typically only conveyable in a live “writing” or demonstration (i.e., performance art).
Proving to oneself that with practice, journal entries take less and less time to write (but what about blog posts?).
Showing the ebbs & flows of a love letter (or Tinder message).
Identifying moments of hesitation in a doctor’s note (don’t hold your breath; I trust Epic has more basic priorities in their hierarchy of needs).
Improving my typing by reporting my slowest words/characters.
Seeing how quickly POTUS adds tweets to the presidential archive.
Tracking when a particular line of code was not only committed — but authored in an IDE.
Confirming your suspicion that research articles rush through their conclusions because authors are drained after typing up results.
Obviously, not everyone wants to be reminded of their writing habits. Even someone looking over your shoulder can be paralyzing. Needless to say, in most cases this information should only be shown on demand, and writers should be able to disable it altogether.
THE APPROACH is simple. I draw a rectangle under each glyph, the height of which represents how long it’s been since the last activity. Define activity however you want (e.g., last click, last edit, etc.). Scale the bars however you like (mine are linear and capped at 1000ms).
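The mapping itself is only a few lines of code. Here is a minimal sketch of the approach just described, where the cap is 1000 ms and the pixel scale is an arbitrary choice:

```python
# Writing-graph bar heights: each glyph gets a bar whose height is
# the time since the previous activity, scaled linearly and capped.
CAP_MS = 1000
MAX_BAR_PX = 20

def bar_heights(timestamps_ms):
    """Pixel heights for the bars under glyphs 1..n-1, given the
    millisecond timestamp of each keystroke."""
    heights = []
    for prev, cur in zip(timestamps_ms, timestamps_ms[1:]):
        delta = min(cur - prev, CAP_MS)          # linear, capped
        heights.append(round(delta / CAP_MS * MAX_BAR_PX))
    return heights

# Typing three characters quickly, then a long pause before the last.
print(bar_heights([0, 120, 250, 3000]))  # -> [2, 3, 20]
```

A renderer would then draw each bar under its glyph; swapping in other definitions of activity (last click, last edit) only changes which timestamps are fed in.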
You can imagine complicating this ad infinitum: coloring the bars by the color of the sky at the time & location of the keystroke; normalizing the bars by difficulty of reaching for different keys on desktop/mobile; showing total time per document/paragraph/line/word/character; etc.
You can also imagine looking beyond typing (there are other ways to add/remove characters), like showing how a paragraph and its alternatives were copyedited, or how the complex web of undo/redo was collapsed. Some techniques will be more expensive, requiring a re-engineering of low-level functions (e.g., rendering text layout, handling selections, etc.) — non-trivial. And some will take a toll on familiarity and/or learnability. But that’s where prototyping and your own design sensibility comes in. (Or stop scrutinizing every datum and embrace the zen of less-is-more).
IN UPCOMING POSTS, I’ll discuss some other topics (and visualizations) in text. If you’re interested in using any of these visualizations in your work, or in a text editor/word processor/infographic/etc., send me an email.