Big Tech & Startups
Google trained a trillion-parameter AI language model (5 minute read)
Generally speaking, the more parameters a machine learning model has, the better it performs at its task. OpenAI's GPT-3, which can make primitive analogies, generate recipes, and even complete basic code, has 175 billion parameters. Google researchers have now trained a language model containing 1.6 trillion parameters. It trains up to four times faster than the previously largest Google-developed language model. Because training at this scale is extremely computationally intensive, the researchers used the Switch Transformer technique, which routes each input token to a single expert sub-network so that only a fraction of the model's parameters is active at a time.
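The core idea behind Switch Transformer layers, top-1 expert routing, can be sketched in a few lines of NumPy. This is a toy illustration with made-up dimensions and random weights, not Google's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (hypothetical, chosen only for illustration).
d_model, n_experts, n_tokens = 8, 4, 5

# A learned router matrix plus one small weight matrix per expert.
router_w = rng.normal(size=(d_model, n_experts))
expert_w = rng.normal(size=(n_experts, d_model, d_model))

def switch_layer(x):
    """Route each token to exactly one expert (top-1), Switch-style."""
    logits = x @ router_w                              # (n_tokens, n_experts)
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)         # softmax over experts
    chosen = probs.argmax(axis=-1)                     # one expert per token
    out = np.empty_like(x)
    for i, e in enumerate(chosen):
        # Only the chosen expert runs; its output is gated by the
        # router probability, so the router receives gradient signal.
        out[i] = probs[i, e] * (x[i] @ expert_w[e])
    return out, chosen

x = rng.normal(size=(n_tokens, d_model))
y, chosen = switch_layer(x)
print(y.shape, chosen)
```

Because each token touches only one expert, total parameter count can grow with the number of experts while per-token compute stays roughly constant, which is what makes trillion-parameter scales tractable.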
Dropbox to cut 11% of its global workforce (1 minute read)
Dropbox is letting go of 315 employees, about 11% of the company. Its chief operating officer will also leave the company on February 5. Dropbox shifted to remote work in October and plans to remain remote-first after the pandemic ends. It will open Dropbox Studios in San Francisco, Seattle, Austin, and Dublin so employees can work together in person when it is safe to do so. Dropbox's stock has dropped more than 6% since Wednesday.
Is this Chinese prototype the shape of maglev train tech to come? (2 minute read)
Researchers at Southwest Jiaotong University in China have unveiled a new maglev vehicle designed to travel at up to 620km/h. It is part of China's plan to create faster links between its cities. The new maglev train uses liquid nitrogen to achieve superconductivity, cutting operating costs to one-fiftieth of those of liquid helium cooling. It is able to levitate from a standing start. There are still issues to work out before the technology becomes commercially viable. Images of the train are available in the article.
The UK Is Developing Nuclear-Powered Space Exploration for Faster Mars Trips (2 minute read)
The UK Space Agency and Rolls-Royce plc have teamed up to explore nuclear-powered propulsion for space exploration. Nuclear-powered propulsion would make space travel faster and more sustainable, and it would make it possible to send missions farther into space, where solar power is unavailable. Faster travel times would also mean that astronauts are exposed to less radiation. The research will last for six months, and the results will then be implemented throughout the decade.
Programming, Design & Data Science
JuiceFS (GitHub Repo)
JuiceFS is a POSIX file system built on top of Redis and object storage. It serves as a stateless middleware that enables many applications to share data easily. JuiceFS is fully POSIX-compatible and cloud-native. It features outstanding performance, file sharing, global file locks, and data compression.
tmpsms (GitHub Repo)
tmpsms generates a temporary phone number for receiving SMSes. It was designed for bug bounty hunters who don’t want to use their personal phone numbers. tmpsms can be integrated with other scripts to improve workflow.
New Research Could Enable Direct Data Transfer From Computers to Living Cells (4 minute read)
DNA could be the future of data storage. A single gram of DNA can store up to 215 million gigabytes of data and keep it stable for extremely long periods if stored properly. Learning to store data in DNA could lead to new capabilities in biotechnology. A new study has demonstrated that it is possible to directly convert digital electronic signals into genetic data stored in the genomes of living cells. The technique can so far only encode tiny amounts of data, but it is a proof of concept, and the researchers claim it could be improved by several orders of magnitude. Data stored in cells remained stable for 60 to 80 generations.
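The density claim follows from DNA's four-letter alphabet: each base can carry two bits, so one byte maps to four bases. A minimal illustrative codec (not the scheme used in the study, which writes data into living cells via electrical signals) might look like:

```python
# Illustrative 2-bits-per-base mapping; the base order here is an
# arbitrary choice, not the encoding used in the study.
BASES = "ACGT"

def bytes_to_dna(data: bytes) -> str:
    """Encode each byte as four bases, most-significant bits first."""
    return "".join(
        BASES[(b >> shift) & 0b11]
        for b in data
        for shift in (6, 4, 2, 0)
    )

def dna_to_bytes(seq: str) -> bytes:
    """Invert the mapping: pack each run of four bases into one byte."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        b = 0
        for base in seq[i:i + 4]:
            b = (b << 2) | BASES.index(base)
        out.append(b)
    return bytes(out)

encoded = bytes_to_dna(b"hi")
print(encoded)                      # → CGGACGGC
assert dna_to_bytes(encoded) == b"hi"
```

At two bits per base, the round trip is lossless; real DNA storage schemes add redundancy and constraints (e.g. avoiding long homopolymer runs) on top of a mapping like this.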
No TLDR Originals for 2021-01-14