MyAnimeList / Kitsu / AniList / Anime-Planet exporter

Exports lists in MAL XML format. Once exported, you can import the file here. MAL's official export is here; if you can log in, use that instead of this (duh).


This only works for publicly accessible lists. It uses the (unofficial) endpoint used by modern lists:

It may intermittently fail; usually retrying works. The alternative is HTML scraping, but that only works on classic-style lists, and the exportable information is limited to the visible columns, so it's not recommended.
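When the endpoint flakes, a small retry helper is usually enough. A minimal sketch — the `fetch` callable, attempt count, and delay are placeholders, not part of this tool:

```python
import time

def fetch_with_retries(fetch, attempts=3, delay=1.0):
    """Call fetch() until it succeeds or attempts run out.

    fetch is any zero-argument callable that raises on failure
    (e.g. a function doing the actual HTTP request to the list URL).
    """
    for attempt in range(1, attempts + 1):
        try:
            return fetch()
        except Exception:
            if attempt == attempts:
                raise  # out of retries; let the caller see the error
            time.sleep(delay * attempt)  # simple linear backoff
```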


There's a Kitsu exporter. It uses Kitsu's API, e.g. [user_id]=1234&filter[kind]=anime&include=user,anime,anime.mappings&fields[anime]=canonicalTitle,episodeCount,slug,mappings&fields[mappings]=externalSite,externalId&fields[user]=name&page[limit]=500

If scraping fails, try using your user ID instead of your username; usernames on Kitsu are not unique.
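For reference, the query string above can be built programmatically. A sketch assuming Kitsu's public JSON:API endpoint at kitsu.io/api/edge (verify the base URL before relying on it):

```python
from urllib.parse import urlencode

# Assumed base endpoint for Kitsu's JSON:API.
KITSU_LIBRARY = "https://kitsu.io/api/edge/library-entries"

def kitsu_page_url(user_id, offset=0, limit=500):
    """Build one page of the library-entries query shown above."""
    params = {
        "filter[user_id]": user_id,
        "filter[kind]": "anime",
        "include": "user,anime,anime.mappings",
        "fields[anime]": "canonicalTitle,episodeCount,slug,mappings",
        "fields[mappings]": "externalSite,externalId",
        "fields[user]": "name",
        "page[limit]": limit,
        "page[offset]": offset,
    }
    return KITSU_LIBRARY + "?" + urlencode(params)
```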


There's also another AniList exporter. It uses AniList's GraphQL API. By default, scores are always rounded down; for example, 7.5 becomes 7. Selecting the rounded option rounds scores to the nearest integer (half up), so 7.5 becomes 8 while 7.4 stays 7. As an exception, any score between 0 and 1 always maps to 1, since a score of 0 is disallowed.
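The two rounding modes can be expressed like this; the function name and the unscored passthrough are my own framing, not the exporter's actual code:

```python
import math

def anilist_to_mal_score(score, round_half_up=False):
    """Convert an AniList POINT_10_DECIMAL score to a MAL integer score.

    score: float in [0, 10]; 0 means "unscored" and is passed through.
    """
    if score == 0:
        return 0                        # unscored stays unscored
    if 0 < score < 1:
        return 1                        # MAL disallows 0, so clamp up
    if round_half_up:
        return math.floor(score + 0.5)  # 7.5 -> 8, 7.4 -> 7
    return math.floor(score)            # default: 7.5 -> 7
```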

query ($username: String, $type: MediaType) {
  MediaListCollection(userName: $username, type: $type) {
    lists {
      entries {
        score(format: POINT_10_DECIMAL)
        startedAt { year, month, day }
        completedAt { year, month, day }
        media {
          title { romaji }
        }
      }
    }
  }
}

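A sketch of posting that query to AniList's public GraphQL endpoint (graphql.anilist.co); the request is only built here, not sent, and the helper name is my own:

```python
import json
from urllib import request

ANILIST_GRAPHQL = "https://graphql.anilist.co"  # AniList's public GraphQL endpoint

QUERY = """
query ($username: String, $type: MediaType) {
  MediaListCollection(userName: $username, type: $type) {
    lists { entries { score(format: POINT_10_DECIMAL) media { title { romaji } } } }
  }
}
"""

def build_request(username, media_type="ANIME"):
    """Build (but do not send) the POST request for one user's list."""
    payload = json.dumps({
        "query": QUERY,
        "variables": {"username": username, "type": media_type},
    }).encode()
    return request.Request(
        ANILIST_GRAPHQL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
```

Sending it is then just `urllib.request.urlopen(build_request("someuser"))`, subject to the retry caveats above.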
There's now a scraper for AnimePlanet - for Anime only. And no, before anyone asks, there won't be one for Manga. A single AnimePlanet entry may map to multiple MAL entries; if this occurs, there will be a comment in the XML. If no mapping can be found, that is also noted in a comment. Entries in the "Won't Watch" category are still written to the XML, but commented out, since MAL has no similar category.

If an entry does not match (see the XML comments) but you know it should match a particular MAL entry, add a comment below. The same goes for an AnimePlanet entry that maps to the wrong MAL entry.
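To illustrate how unmapped and "Won't Watch" entries end up as XML comments: the tag names follow MAL's XML export format, but the helper itself is hypothetical, not the exporter's code:

```python
from xml.sax.saxutils import escape

def entry_xml(title, mal_id=None, wont_watch=False):
    """Render one <anime> element; unmapped or Won't Watch entries
    become XML comments instead of live elements."""
    if mal_id is None:
        return f"<!-- no MAL mapping found for {escape(title)} -->"
    body = (
        f"<anime><series_animedb_id>{mal_id}</series_animedb_id>"
        f"<series_title>{escape(title)}</series_title></anime>"
    )
    if wont_watch:
        return f"<!-- Won't Watch: {body} -->"
    return body
```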


Unlike all the other scrapers, this one depends on an offline scrape of both MAL and AnimePlanet that maps entries between them. Entries were mapped by comparing titles (and synonyms), episode counts, and start years, progressing from the most exact matches to less certain matches based on fuzzy text matching. Because of this, there may be errors in how entries are mapped - use at your own risk. The mapping is current as of 2019/06/29, but because it is static, it will fail to map newly added entries. If anyone is particularly keen on maintaining it, reach out.
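The progressive matching idea can be sketched like this; the field choices and the 0.9 cutoff are illustrative, not the actual mapper:

```python
import difflib

def match_entry(title, episodes, year, candidates, cutoff=0.9):
    """Match one AnimePlanet entry against MAL candidates.

    candidates: list of (title, episodes, year, mal_id) tuples.
    Pass 1: exact title + episode count + start year.
    Pass 2: fuzzy title (difflib ratio >= cutoff) + episode count.
    Returns a mal_id, or None if nothing is close enough.
    """
    for t, e, y, mal_id in candidates:
        if t == title and e == episodes and y == year:
            return mal_id
    best, best_score = None, cutoff
    for t, e, y, mal_id in candidates:
        if e != episodes:
            continue
        score = difflib.SequenceMatcher(None, title.lower(), t.lower()).ratio()
        if score >= best_score:
            best, best_score = mal_id, score
    return best
```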

When scraping the list, it scrapes the publicly visible grid view, 560 entries at a time:

AnimePlanet avoids exposing the database id wherever it possibly can, but the id does in fact exist, and it so happens that the grid view exposes it. This is crucial: without it, the mapping becomes far harder. Counterintuitively, the list view doesn't expose that id. Incidentally, the 'export' that AnimePlanet offers is also garbage, because it doesn't include the database id either. There is also an assumption that the database id is unique to the entry and that ids aren't reused. That's somewhat suspicious given the relatively low maximum database id, so if the assumption is false, the mapping will be broken.
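A sketch of pulling ids out of a grid page; the `data-id` attribute name is an assumption about the markup, not confirmed, so adjust the pattern to what the page actually serves:

```python
import re

# Assumption: grid tiles carry the database id in a data-id attribute.
GRID_ID_RE = re.compile(r'data-id="(\d+)"')

def extract_grid_ids(html):
    """Pull every database id out of one grid-view page, in order."""
    return [int(m) for m in GRID_ID_RE.findall(html)]
```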

There are mappings for >90% of the current entries; most of the remainder are music video entries that MAL does not have. There are also a couple of hundred shows that were difficult to match without some serious digging.

Other notes

For both Kitsu and AniList, check the XML for comments - they list any entries that have no equivalent MAL entry.

If you set the update_on_import checkbox, all entries will have

<update_on_import>1</update_on_import>

instead of

<update_on_import>0</update_on_import>

This will cause all entries to be overwritten by default when imported.

Cover images CSS generation

There's a basic CSS generator for cover images that uses the endpoint mentioned above. See here for more information.
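A sketch of what such a generator might emit; the selector shape is a placeholder assumption, so adapt it to whatever your list design actually uses to target a row by entry URL:

```python
def cover_css(entries):
    """Emit one CSS rule per (mal_id, image_url) pair."""
    rules = [
        f'a[href*="/anime/{mal_id}/"] {{ background-image: url("{url}"); }}'
        for mal_id, url in entries
    ]
    return "\n".join(rules)
```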