I have a question about sitemaps. We switched from static sitemaps to programmatically generated ones. The old (static) sitemaps were removed from Google Search Console (GSC) and from robots.txt, and their URLs now return 404. The new sitemaps were submitted everywhere and fetch correctly. Yet Google keeps crawling the old sitemaps, and the number of errors in GSC keeps growing. How can I get Google to drop the old (removed) sitemaps and crawl only the new ones?
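For context, here is roughly what our updated robots.txt declares now (the domain and sitemap path below are placeholders, not our real URLs):

```
User-agent: *
Allow: /

# Only the new programmatic sitemap index is referenced;
# the old static sitemap URLs no longer appear here and return 404.
Sitemap: https://www.example.com/sitemap-index.xml
```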

