{"id":250103,"date":"2024-08-01T12:54:24","date_gmt":"2024-08-01T12:54:24","guid":{"rendered":"https:\/\/michigandigitalnews.com\/index.php\/2024\/08\/01\/openai-vows-to-provide-the-us-government-early-access-to-its-next-ai-model\/"},"modified":"2025-06-25T17:13:25","modified_gmt":"2025-06-25T17:13:25","slug":"openai-vows-to-provide-the-us-government-early-access-to-its-next-ai-model","status":"publish","type":"post","link":"https:\/\/michigandigitalnews.com\/index.php\/2024\/08\/01\/openai-vows-to-provide-the-us-government-early-access-to-its-next-ai-model\/","title":{"rendered":"OpenAI vows to provide the US government early access to its next AI model"},"content":{"rendered":"<div>\n<p>OpenAI will give the US AI Safety Institute early access to its next model as part of its safety efforts, Sam Altman revealed in a tweet. Apparently, the company has been working with the consortium &#8220;to push forward the science of AI evaluations.&#8221; The National Institute of Standards and Technology (NIST) formally established the Artificial Intelligence Safety Institute earlier this year, though Vice President Kamala Harris <a data-i13n=\"cpos:1;pos:1\" href=\"https:\/\/www.engadget.com\/kamala-harris-announces-ai-safety-institute-to-protect-american-consumers-060011065.html\" data-ylk=\"slk:announced;cpos:1;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">announced<\/a> it back in 2023 at the UK AI Safety Summit. 
Based on the <a data-i13n=\"cpos:2;pos:1\" href=\"https:\/\/www.nist.gov\/aisi\/artificial-intelligence-safety-institute-consortium-aisic\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:NIST's description;cpos:2;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">NIST&#8217;s description<\/a> of the consortium, it&#8217;s meant &#8220;to develop science-based and empirically backed guidelines and standards for AI measurement and policy, laying the foundation for AI safety across the world.&#8221;<\/p>\n<p>The company, along with DeepMind, similarly pledged to share AI models <a data-i13n=\"cpos:3;pos:1\" href=\"https:\/\/www.engadget.com\/google-openai-will-share-ai-models-with-the-uk-government-134318263.html\" data-ylk=\"slk:with the UK government;cpos:3;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">with the UK government<\/a> last year. As <a data-i13n=\"cpos:4;pos:1\" href=\"https:\/\/techcrunch.com\/2024\/07\/31\/openai-pledges-to-give-u-s-ai-safety-institute-early-access-to-its-next-model\/\" data-ylk=\"slk:TechCrunch;cpos:4;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \"><em>TechCrunch<\/em><\/a> notes, there have been growing concerns that OpenAI is making safety less of a priority as it seeks to develop more powerful AI models. There was speculation that the board decided to kick Sam Altman out of the company \u2014 he was very quickly <a data-i13n=\"cpos:5;pos:1\" href=\"https:\/\/www.engadget.com\/sam-altman-is-reinstated-as-openai-ceo-five-days-after-being-fired-070037749.html\" data-ylk=\"slk:reinstated;cpos:5;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">reinstated<\/a> \u2014 over safety and security concerns. 
However, the company <a data-i13n=\"cpos:6;pos:1\" href=\"https:\/\/www.engadget.com\/internal-memo-says-sam-altmans-firing-wasnt-due-to-malfeasance-or-openai-safety-practices-205156164.html\" data-ylk=\"slk:told staff members;cpos:6;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">told staff members<\/a> in an internal memo at the time that it was due to &#8220;a breakdown in communication.&#8221;<\/p>\n<p>In May this year, OpenAI admitted that it <a data-i13n=\"cpos:7;pos:1\" href=\"https:\/\/www.engadget.com\/the-openai-team-tasked-with-protecting-humanity-is-no-more-183433377.html\" data-ylk=\"slk:disbanded the Superalignment team;cpos:7;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">disbanded the Superalignment team<\/a> it had created to ensure that humanity remains safe as the company advances its work on generative artificial intelligence. Before that, OpenAI co-founder and Chief Scientist Ilya Sutskever, who was one of the team&#8217;s leaders, <a data-i13n=\"cpos:8;pos:1\" href=\"https:\/\/www.engadget.com\/openai-co-founder-and-chief-scientist-ilya-sutskever-is-leaving-the-company-054650964.html\" data-ylk=\"slk:left the company;cpos:8;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">left the company<\/a>. Jan Leike, who was also one of the team&#8217;s leaders, quit as well. 
In a series of tweets, Leike said he had disagreed with OpenAI&#8217;s leadership over the company&#8217;s core priorities for quite some time and that &#8220;safety culture and processes have taken a backseat to shiny products.&#8221; OpenAI <a data-i13n=\"cpos:9;pos:1\" href=\"https:\/\/www.engadget.com\/openais-new-safety-team-is-led-by-board-members-including-ceo-sam-altman-164927745.html\" data-ylk=\"slk:created a new safety group;cpos:9;pos:1;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">created a new safety group<\/a> at the end of May, but it is led by board members who include Altman, prompting concerns about self-policing.<\/p>\n<div class=\"twitter-tweet-wrapper\" data-embed-anchor=\"a6d23ee2-78c9-5179-81ad-297b5fe41ca3\">\n<blockquote placeholder=\"\" data-theme=\"light\" class=\"twitter-tweet\">\n<p>a few quick updates about safety at openai:<\/p>\n<p>as we said last july, we\u2019re committed to allocating at least 20% of the computing resources to safety efforts across the entire company.<\/p>\n<p>our team has been working with the US AI Safety Institute on an agreement where we would provide\u2026<\/p>\n<p>\u2014 Sam Altman (@sama) <a href=\"https:\/\/twitter.com\/sama\/status\/1818867964369928387?ref_src=twsrc%5Etfw\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:August 1, 2024;elm:context_link;itc:0;sec:content-canvas\" class=\"link \">August 1, 2024<\/a><\/p>\n<\/blockquote>\n<\/div>\n<\/div>\n<p><script async src=\"\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><br \/>\n<a href=\"https:\/\/www.engadget.com\/openai-vows-to-provide-the-us-government-early-access-to-its-next-ai-model-110017697.html?src=rss\">Source link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>OpenAI will give the US AI Safety Institute early access to its next model as part of its safety efforts, Sam Altman has 
revealed<\/p>\n","protected":false},"author":1,"featured_media":250104,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"footnotes":""},"categories":[159],"tags":[],"_links":{"self":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts\/250103"}],"collection":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/comments?post=250103"}],"version-history":[{"count":0,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts\/250103\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/media\/250104"}],"wp:attachment":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/media?parent=250103"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/categories?post=250103"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/tags?post=250103"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}