OpenAI is developing an open-source tool to identify how an LLM's behaviors relate to different parts of the model's internals. This is part of a wider effort to anticipate and mitigate problems with AI systems. While the tool could potentially improve an LLM's performance, it is still in its early stages and has a way to go before being truly useful. The code for the tool is available on GitHub.
Amazon is developing new features for its Astro home robot. Project Burnham adds conversational abilities and intelligence to Astro and enables the robot to remember what it saw and understood. Astro will be able to engage in Q&A dialogues, follow up on previous interactions, and take appropriate actions. It will understand everyday activities in a home setting by leveraging common-sense knowledge derived from language model training. Burnham may be incorporated into future robots beyond Astro.
Humane has developed an AI-powered wearable designed to replace the traditional smartphone. The device, a small black puck with a camera, projector, and speaker, can answer phone calls, translate languages, provide advice, and more. It operates independently, without needing to be paired with another device such as a smartphone. A video of Humane's co-founder demonstrating the device is available in the article.
People can now communicate while lucid dreaming. Lucid dreaming is a state in which dreamers are aware they are dreaming and can manipulate what happens in the dream. Sleep expert Michael Raduga of Phase Research Center has developed Remmyo, a language for communicating while lucid dreaming. Remmyo consists of six sets of facial movements detected by electromyography sensors on the face. The signals are fed to software that can type, vocalize, and translate Remmyo.
Python has an extensive standard library with many modules and functions included. However, there are also important third-party libraries worth adding to any Python project. This article lists those essential libraries, covering general use, data validation, debugging, testing, and more.
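As a taste of what such third-party libraries offer, here is a minimal data-validation sketch. It assumes pydantic as a representative validation library; the article may recommend a different one:

```python
# Data validation with pydantic (assumed here as an example library):
# declare the expected shape of the data once, and invalid input is
# rejected automatically instead of silently propagating.
from pydantic import BaseModel, ValidationError


class User(BaseModel):
    name: str
    age: int


# Valid input parses into a typed object.
user = User(name="Ada", age=36)
print(user.age)  # 36

# Invalid input raises a ValidationError with a descriptive message.
try:
    User(name="Ada", age="not a number")
except ValidationError:
    print("validation failed")
```

Compared with hand-written `isinstance` checks, the model class doubles as documentation of the data contract.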
llm is a Rust ecosystem of libraries for running inference on large language models. It is based on llama.cpp. llm includes a wrapper around llm-base and the supported models, along with a CLI application that offers a user-friendly interface for running inference on those models. Inference can currently only be done on CPU, but the ecosystem aims to add GPU inference in the future. llm supports GPT-2, GPT-J, LLaMA, GPT-NeoX, StableLM, Dolly v2, and BLOOMZ.
Companies are reverting to traditional cost-cutting measures to maintain profits and satisfy shareholders. A number of employers have either frozen their hiring plans or cut employees. Remaining employees' benefits have been slashed, and worker reviews have become harsher, reflecting a less friendly workplace environment. The era of lavish employee benefits granted during the Age of the Worker appears to be ending.
Wendy’s will open a new AI-powered drive-thru in Columbus, Ohio, in June. Wendy’s FreshAI will be powered by Google’s AI chatbot service. The chatbot will use voice recognition to take orders from customers. A human employee will monitor the service to ensure that it is working correctly. Customers will have the option to speak with a human if needed. Wendy's is not planning to replace existing workers with the technology.
This article contains a transcript of an interview with Brian Chesky, CEO of Airbnb, where he discusses Airbnb's hybrid work policy, Rooms, Airbnb's partnership with Jony Ive, and more.
Anthropic's Constitutional AI seeks to guide the outputs of AI language models in a subjectively safer and more helpful direction by training them against an initial list of principles.