"With the x-robots-tag set to noindex, google search console does not accept the sitemap."
I can't reproduce that. Some time ago I added several sitemaps to GSC with this change applied, and they were accepted.
You have to distinguish between crawling and indexing. Crawling means that the crawler is able and allowed to retrieve the content and follow the links. This is still possible for XML sitemaps served with the "X-Robots-Tag: noindex" header. It works the same way as a "robots" meta tag with "noindex" on a page: the page is crawled (and its links followed), but it isn't indexed (i.e. displayed in the search results).
But: The XML sitemap itself should not show up in the results of a search engine. Hence the "X-Robots-Tag: noindex" header. See also the link in the description of this issue.
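For illustration, the sitemap response then looks roughly like this (hypothetical status line and values; the point is only the last header):

```http
HTTP/1.1 200 OK
Content-Type: application/xml;charset=utf-8
X-Robots-Tag: noindex
```

Google can still fetch and process such a sitemap normally; the header only tells it not to list the sitemap URL itself in search results.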
And: If you want to allow indexing of the XML sitemap (i.e. have it displayed in search results) for whatever reason, you can easily remove the header again by adding the following line to the TypoScript setup of your site package:
seo_sitemap.config.additionalHeaders.20 >
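For context, a minimal sketch of what that line removes, assuming the default EXT:seo sitemap setup where index 20 holds the X-Robots-Tag header (verify the index in your installation, e.g. via the TypoScript Object Browser, before relying on it):

```typoscript
# Default headers set by EXT:seo for the sitemap page object (sketch):
seo_sitemap.config.additionalHeaders {
  10.header = Content-Type: application/xml;charset=utf-8
  20.header = X-Robots-Tag: noindex
}

# In your site package, remove the noindex header again:
seo_sitemap.config.additionalHeaders.20 >
```

After clearing the caches, the sitemap is delivered without the noindex header and may then appear in search results.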