{"id":212586,"date":"2024-03-12T17:48:07","date_gmt":"2024-03-12T17:48:07","guid":{"rendered":"https:\/\/michigandigitalnews.com\/index.php\/2024\/03\/12\/ai-powered-robot-maker-covariant-debuts-rfm-1-a-robot-language\/"},"modified":"2025-06-25T17:20:46","modified_gmt":"2025-06-25T17:20:46","slug":"ai-powered-robot-maker-covariant-debuts-rfm-1-a-robot-language","status":"publish","type":"post","link":"https:\/\/michigandigitalnews.com\/index.php\/2024\/03\/12\/ai-powered-robot-maker-covariant-debuts-rfm-1-a-robot-language\/","title":{"rendered":"AI-powered robot maker Covariant debuts RFM-1, a &#8216;robot language&#8217;"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/readwrite.com\/wp-content\/uploads\/2024\/03\/bbe0d027fe82f13e1f5eda4efef6adcc2c87741f-3170x1772-ezgif.com-webp-to-jpg-converter-900x503.jpg\" \/><\/p>\n<div>\n<p>Covariant has announced the launch of an <a href=\"https:\/\/readwrite.com\/category\/ai\/\">artificial intelligence<\/a> (AI) program with a difference, described as \u201cbasically a <a href=\"https:\/\/readwrite.com\/ai-chatbots-think-in-english-research-finds\/\">large language model<\/a> (LLM), but for robot language.\u201d<\/p>\n<p>Peter Chen, co-founder and CEO of the Berkeley, California-based company, spoke to <a href=\"https:\/\/techcrunch.com\/2024\/03\/11\/covariant-is-building-chatgpt-for-robots\/\">TechCrunch<\/a> to introduce RFM-1 and its aim of \u201cgiving robots human-like reasoning capabilities.\u201d<\/p>\n<p>A <a href=\"https:\/\/covariant.ai\/insights\/introducing-rfm-1-giving-robots-human-like-reasoning-capabilities\/\">press release<\/a> detailed the development of the foundation model, which Covariant wants to use to take robotics beyond fairly basic, repetitive tasks.<\/p>\n<p>RFM-1 was built on a vast collection of data gathered from Covariant\u2019s Brain AI program, which has been harnessed with customer 
consent to build up a ChatGPT for robots.<\/p>\n<p>\u201cThe vision of RFM-1 is to power the billions of robots to come,\u201d said Chen.<\/p>\n<p>\u201cWe at Covariant have already deployed lots of robots at warehouses with success, but that is not the limit of where we want to get to. We really want to power robots in manufacturing, food processing, recycling, agriculture, the service industry and even into people\u2019s homes.\u201d<\/p>\n<blockquote class=\"twitter-tweet\" data-width=\"550\" data-dnt=\"true\">\n<p lang=\"en\" dir=\"ltr\">Very excited to share what our research team at Covariant has been working on: RFM-1, our latest Robotics Foundation Model. Built on top of high-quality multimodal data the Covariant robotics fleet has collected over the years, this project embodies our commitment to pushing the\u2026 <a href=\"https:\/\/t.co\/RoVh55BVKR\">pic.twitter.com\/RoVh55BVKR<\/a><\/p>\n<p>\u2014 Rocky Duan (@rocky_duan) <a href=\"https:\/\/twitter.com\/rocky_duan\/status\/1767247209458978875?ref_src=twsrc%5Etfw\">March 11, 2024<\/a><\/p>\n<\/blockquote>\n<h2>Will human job functions be impacted?<\/h2>\n<p>With human-like reasoning come risks to human job functions. Significant challenges lie ahead as AI advances. 
If robots are <a href=\"https:\/\/readwrite.com\/figure-ai-robotics-firm-sign-major-deal-with-bmw\/\">to be scaled up for human job roles<\/a> like those listed above, companies like Covariant will need to be part of the conversation about how the human workforce can be upskilled and prepared for a different future.<\/p>\n<p>Chen sought to allay fears about what is meant by human-like reasoning: in this instance, it is the ability of the robots to process real-time information and choose the best course of action for the task at hand.<\/p>\n<p>The CEO acknowledged, however, that his firm is not yet at the stage it is ultimately aiming for.<\/p>\n<p>\u201cWe do like a lot of the work that is happening in the more general-purpose robot hardware space,\u201d Chen explained.<\/p>\n<p>\u201cCoupling the intelligence inflexion point with the hardware inflexion point is where we will see even more explosion of robot applications but a lot of those are not fully there yet, especially on the hardware side. It\u2019s very hard to go beyond the\u00a0staged video. How many people have interacted with a humanoid in person? 
That tells you the degree of maturity.\u201d<\/p>\n<p><em>Image credit: <a href=\"https:\/\/www.pexels.com\/photo\/grayscale-of-futuristic-dancing-robot-8294624\/\">Covariant.AI<\/a><\/em><\/p>\n<\/div>\n<p><script async src=\"\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Covariant has announced the launch of an artificial intelligence (AI) program with a difference, described as \u201cbasically a large language model (LLM),<\/p>\n","protected":false},"author":1,"featured_media":212587,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"footnotes":""},"categories":[152],"tags":[],"_links":{"self":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts\/212586"}],"collection":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/comments?post=212586"}],"version-history":[{"count":3,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts\/212586\/revisions"}],"predecessor-version":[{"id":338428,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts\/212586\/revisions\/338428"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index
.php\/wp-json\/wp\/v2\/media\/212587"}],"wp:attachment":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/media?parent=212586"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/categories?post=212586"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/tags?post=212586"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}