<h1>xLSTM: The inventors of LSTM present a Transformer contender</h1>

<p>Has Sepp Hochreiter done it again? After months of announcements, a group around the inventor of the LSTM has finally published a <a href="https://arxiv.org/pdf/2405.04517">paper</a> presenting <strong>xLSTM</strong> to the world.</p>

<p>Until the arrival of the Transformer in 2017, the <strong>LSTM</strong> had been the go-to technology for a wide variety of sequence-related tasks, including text generation. Three limitations</p>

<ol>
<li>the inability to revise storage decisions,</li>
<li>limited storage capacity, and</li>
<li>the need for sequential rather than parallel processing</li>
</ol>

<p>relegated LSTMs to second place behind Transformers.</p>

<p>The group proposes two new types of LSTM memory cell, baptized <strong>sLSTM</strong> and <strong>mLSTM</strong> (see the figure from the original paper below). Both sLSTM and mLSTM are placed in a residual block (adding skip connections, similar to Transformers).
These blocks can be stacked in various combinations, and the resulting stack constitutes the complete <strong>xLSTM</strong> architecture.</p>

<figure><img src="https://wsw-int.de/wp-content/uploads/2024/05/Screenshot-2024-05-09-143748-1024x624.png" width="1024" height="624" alt="Overview of the sLSTM and mLSTM cells and the xLSTM block architecture" /><figcaption>From the original paper.</figcaption></figure>

<ul>
<li>The input and forget gates of the <strong>sLSTM</strong> cell become exponential gates, equipping it with the ability to <strong>revise storage decisions</strong>. As in standard LSTMs, an sLSTM layer can have multiple memory cells, which allows memory mixing across cells. sLSTM can also have multiple heads, again a Transformer idea injected into LSTMs.
Memory mixing across heads, however, is not possible.</li>

<li>Instead of a scalar, the <strong>mLSTM</strong> memory cell is a matrix, giving it <strong>enhanced storage capacity</strong>. For storage and retrieval, mLSTM adopts the key, value, and query vectors familiar from Transformers. Consequently there is no memory mixing, but multiple heads are possible here as well.</li>

<li>The memory mixing in sLSTM requires <strong>sequential calculations</strong>, ruling out parallelization across the sequence. Sepp Hochreiter's team does provide a fast CUDA kernel, but the speed handicap remains.</li>
</ul>

<p>In the experimental section, xLSTM is pitted against other methods, most notably Transformers. Overall, <strong>xLSTM compares favorably</strong> on many tasks, including large language modeling. Ablation studies show that both components, sLSTM and mLSTM, contribute to the improvement over the regular LSTM. An important observation is also that xLSTM scales well, so its performance is not limited to smaller datasets. Speed tests are not shown.</p>
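To make the two cell types concrete, here is a heavily simplified single-step sketch of their update rules in NumPy. It is not the paper's implementation: gate pre-activations are taken as given inputs rather than computed from learned projections, heads and block structure are omitted, and the variable names are our own.

```python
import numpy as np

def slstm_step(c, n, m, z, i_pre, f_pre, o):
    """One sLSTM step with exponential gating (simplified, scalar state).

    Exponential input/forget gates can overflow, so a running stabilizer
    state m tracks the largest log-gate value and is subtracted before
    exponentiation (a log-sum-exp-style trick).
    """
    m_new = max(f_pre + m, i_pre)              # stabilizer update
    i = np.exp(i_pre - m_new)                  # stabilized exponential input gate
    f = np.exp(f_pre + m - m_new)              # stabilized exponential forget gate
    c_new = f * c + i * z                      # cell state: old content can be overridden
    n_new = f * n + i                          # normalizer state
    h = o * (c_new / n_new)                    # normalized, output-gated hidden state
    return c_new, n_new, m_new, h

def mlstm_step(C, n, k, v, q, i, f):
    """One mLSTM step: a matrix memory C stores value vectors under keys.

    Write: rank-one update i * v k^T; read: project C onto the query q.
    """
    C_new = f * C + i * np.outer(v, k)         # matrix cell-state update
    n_new = f * n + i * k                      # key normalizer
    h = C_new @ q / max(abs(n_new @ q), 1.0)   # normalized retrieval
    return C_new, n_new, h
```

Note that the mLSTM step depends on the past only through the states C and n, with no hidden-to-hidden recurrence, which is why it can be evaluated in parallel over the sequence, whereas the memory mixing in sLSTM forces step-by-step computation.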
<p>It will be interesting to observe to what extent xLSTM gains traction, and what, in business speak, its key success factors will be.</p>

<p><a href="https://multai.eu/de/">MultAI.eu</a> …</p>