OpenAI Starts Offering a Biology-Tuned LLM

2 weeks ago
An anonymous reader quotes a report from Ars Technica: On Thursday, OpenAI announced it had developed a large language model specifically trained on common biology workflows. Called GPT-Rosalind after Rosalind Franklin, the model appears to differ from most science-focused models from major tech companies, which have generally taken a more generic approach that works for various fields. In a press briefing, Yunyun Wang, OpenAI's Life Sciences Product Lead, said the system was designed to tackle two major roadblocks faced by current biology researchers. One is the massive datasets created by decades of genome sequencing and protein biochemistry, which can be too much for any one researcher to take in. The second is that biology has many highly specialized subfields, each with its own techniques and jargon. So, for example, a geneticist who finds themselves working on a gene that's active in brain cells might struggle to understand the immense neurobiological literature.

Wang said the company had taken an LLM and trained it on 50 of the most common biological workflows, as well as on how to access the major public databases of biological information. Further training has resulted in a system that can suggest likely biological pathways and prioritize potential drug targets. "We're connecting genotype to phenotype through known pathways and regulatory mechanisms, infer likely structural or functional properties of proteins, and really leveraging this mechanistic understanding," Wang said. To address LLMs' tendencies toward sycophancy and overenthusiasm, OpenAI says it has tuned the model to be more skeptical, so it's more likely to tell you when something is a bad drug target.

There was a lot of talk about GPT-Rosalind's "reasoning" and "expert-level" abilities. We were told that the former was defined as being able to work through complex, multi-step processes, while the latter was derived from the model's performance on a handful of benchmarks.
Access to GPT-Rosalind is currently limited "due to concerns about the model's potential for harmful outputs if asked to do something like optimize a virus's infectivity," notes Ars. Only U.S.-based organizations can request access at the moment.

Read more of this story at Slashdot.

BeauHD

Microsoft Increases the FAT32 Limit From 32GB To 2TB

2 weeks ago
Longtime Slashdot reader AmiMoJo writes: Windows has limited FAT32 partitions to a maximum of 32GB for decades now. When memory cards and USB drives exceeded 32GB in size, the only options were exFAT or NTFS. Neither option was well supported on other platforms at first, although exFAT support is fairly widespread now. In their latest blog post, Microsoft announced that the limit for FAT32 partitions is being increased to 2TB. Of course, that doesn't mean that every device that supports FAT32 will work flawlessly with a 2TB partition size, but at least there is a decent chance that older devices that don't support exFAT will now be usable with memory cards over 32GB.
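The 2TB figure isn't arbitrary: FAT32's on-disk layout records the total sector count in a 32-bit field, so with the common 512-byte sector size the addressable volume tops out at roughly 2TB. A minimal sketch of that arithmetic (assuming 512-byte sectors; larger sector sizes raise the ceiling):

```python
# Why 2 TB is the natural FAT32 volume ceiling, assuming 512-byte sectors.
# The FAT32 boot sector stores the total sector count in a 32-bit field,
# so the volume can span at most 2**32 sectors.
SECTOR_SIZE = 512          # bytes per sector (the most common size)
MAX_SECTORS = 2**32        # limit of the 32-bit total-sector field

max_volume_bytes = SECTOR_SIZE * MAX_SECTORS
print(max_volume_bytes // 2**40, "TiB")  # -> 2 TiB
```

The historical 32GB cap, by contrast, came from the Windows format tool refusing to create larger FAT32 volumes, not from the filesystem format itself, which is why third-party tools have long been able to format bigger FAT32 drives.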

Read more of this story at Slashdot.

BeauHD