Mirror of https://github.com/placeAtlas/atlas.git, synced 2025-01-17 01:12:23 +01:00
Merge pull request #1435 from Hans5958/live-love-life/4
Various "minor" changes (4)
This commit is contained in: commit 5992be6391
25 changed files with 662 additions and 213 deletions
.github/workflows/validate-json.yml (vendored, 18 changes)
@@ -1,4 +1,4 @@
-name: Validate JSON
+name: Validate Atlas data
 on:
   push:
     paths:
@@ -8,10 +8,20 @@ on:
       - web/atlas.json

 jobs:
   validate:
-    name: Validate JSON
+    name: Validate
     runs-on: ubuntu-latest
     steps:
       - name: Checkout code
        uses: actions/checkout@v3
-      - name: Validate JSON
-        run: python3 tools/ci/validate_json.py web/atlas.json
+      - name: Cache dependencies
+        uses: actions/cache@v3
+        with:
+          path: ~/.cache/pip
+          key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements.txt') }}
+          restore-keys: |
+            ${{ runner.os }}-pip-
+      - name: Validate
+        run: |
+          pip3 install -r tools/ci/requirements.txt
+          python3 tools/ci/validate_json.py web/atlas.json tools/schema/atlas.json
+          python3 tools/ci/validate_json.py data/patches tools/schema/patch.json
CONTRIBUTING.md

@@ -6,8 +6,6 @@ You may contribute to the project by submitting a Pull Request on the GitHub rep

 ## New Atlas entries

-> **Warning**: **WE ONLY ACCEPT NEW ENTRIES ON REDDIT!**
-
 To contribute to the map, we require a certain format for artwork region and labels. This can be generated on [the drawing mode](https://place-atlas.stefanocoding.me?mode=draw) on the website.

 To add a new entry, go to [the drawing mode](https://place-atlas.stefanocoding.me?mode=draw) and draw a shape/polygon around the region you'd like to describe. You can use the <kbd>Undo</kbd>, <kbd>Redo</kbd>, and <kbd>Reset</kbd> buttons to help you create a good polygon. Make sure that the lines you're drawing don't form a [self-intersecting polygon](https://upload.wikimedia.org/wikipedia/commons/thumb/0/0f/Complex_polygon.svg/288px-Complex_polygon.svg.png).

@@ -26,34 +24,59 @@ When you're happy with the shape you've drawn, press <kbd>Finish</kbd>. You will

 All fields but the name are optional. For example, a country flag doesn't necessarily need a description.

-Once you've entered all the information, you'll be presented with a pop-up window containing some [JSON](https://en.wikipedia.org/wiki/JSON)-formatted data. You can press the <kbd>Post Direct to Reddit</kbd> button and just press the send button on Reddit, or copy the entire JSON text and [create a new text post on the subreddit](https://www.reddit.com/r/placeAtlas2/submit). You don't need to add any other text; just directly send the data.
+Once you've entered all the information, you'll be presented with a pop-up window containing some [JSON](https://en.wikipedia.org/wiki/JSON)-formatted submission data. There are two preferred methods to submit it: through Reddit or through GitHub.

+### Through Reddit

+You can press the <kbd>Post Direct to Reddit</kbd> button and just press the send button on Reddit, or copy the entire JSON text and [create a new text post on the subreddit](https://www.reddit.com/r/placeAtlas2/submit). You don't need to add any other text; just directly send the data.

+Remember to flair your post with <kbd>New Entry</kbd>. On New Reddit, click the <kbd>Flair</kbd> button on the bottom part, and select <kbd>New Entry</kbd>. On Old Reddit, click the <kbd>select</kbd> button on the "choose a flair" section instead.

+### Through GitHub

+If you know about Git and how to create a pull request on GitHub, you can try to create a patch that will be merged, along with other patches, by one of the members.

+You can use the provided `tools/create_patch.py` script. This script helps you create a working patch, along with additional data such as your name, for attribution's sake. Simply run the script inside the `tools/` folder and follow the given instructions.

+If you want to do this manually (e.g. you don't have Python), you can create a patch by creating a `.json` file inside `data/patches` containing the JSON-formatted data given earlier. You may add attribution by adding an `_author` key whose value is your Reddit username, or your GitHub username with a `gh:` prefix.

+```json5
+{
+  "id": 0,
+  // ...
+  "_author": "Hans5958_",
+  // or...
+  "_author": "gh:Hans5958",
+}
+```
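
Before committing, you can run your patch through the same schema check that CI uses. A minimal local sketch, assuming `jsonschema` is installed (`pip3 install -r tools/ci/requirements.txt`), that you run it from the repository root, and where `gh-0000-my-entry.json` is a hypothetical patch file name:

```python
import json
import os
from pathlib import Path, PurePosixPath

from jsonschema import RefResolver, validate

schema_path = "tools/schema/patch.json"
with open(schema_path, "r", encoding="utf-8") as schema_file:
    schema = json.load(schema_file)

# The resolver lets patch.json's "$ref": "atlas.json#/definitions/entry"
# resolve against the schema files on disk, as tools/ci/validate_json.py does.
resolver = RefResolver("file:" + str(PurePosixPath(Path(os.getcwd(), schema_path))), schema)

with open("data/patches/gh-0000-my-entry.json", "r", encoding="utf-8") as patch_file:
    patch = json.load(patch_file)

validate(patch, schema, resolver=resolver)  # raises ValidationError if the patch is invalid
print("Patch is valid")
```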

+Once you have successfully created the patch, the file can be committed, and a pull request towards the `cleanup` branch can be created. A member will merge the pull request if it is adequate.

 ## Edits to Atlas entries

 Other than adding new ones, you can edit existing atlas entries.

 ### Using the web interface

-You can use the website to edit single entries easily. On the website, click <kbd>Edit</kbd> on an entry box. Afterwards, you are now on the drawing mode, editing the entry, in which you can follow the same instructions as [when creating a new entry](#new-atlas-entries). Upon submitting, please flair it as <kbd>Edit Entry</kbd> instead.
+You can use the website to edit single entries easily. On the website, click <kbd>Edit</kbd> on an entry box. Afterwards, you will be in the drawing mode, editing the entry, where you can follow the same instructions as [when creating a new entry](#new-atlas-entries).

-As an alternative, you can also submit an issue on GitHub using [this form](https://github.com/placeAtlas/atlas/issues/new?assignees=&labels=entry+update&template=edit-entry.yml).
+Upon submitting, if you use Reddit, please flair it as <kbd>Edit Entry</kbd> instead. The method stays the same if you use GitHub.

+As an alternative, you can also submit an issue on GitHub using [this form](https://github.com/placeAtlas/atlas/issues/new?assignees=&labels=entry+update&template=edit-entry.yml) or report it on our Discord server.

 ### Manually

-Edits are also welcome on this repository through GitHub. You may use GitHub for bulk or large-scale changes, such as removing duplicates.
+Edits are also welcome on this repository using Git through GitHub. You may use Git or GitHub for bulk or large-scale changes, such as removing duplicates.

-`web/atlas.json` is where the Atlas data is located, in which you can edit on GitHub. Below is an example of an entry. The example has been expanded, but please save it in the way so each line is an entry which is minified.
+The Atlas data is located in `web/atlas.json`, which you can edit on GitHub. The next section includes an example of an entry.

-Upon creating a fork of this repository and pushing the changes, create a Pull Request against the `cleanup` branch. A member will merge the pull request if it is adequate.
+Upon creating a fork of this repository and pushing the changes, create a pull request towards the `cleanup` branch. A member will merge the pull request if it is adequate.

 To help find duplicates, [use the Overlap mode](https://place-atlas.stefanocoding.me?mode=overlap).

 ### Example

-Hereforth is an example of the structured data.
+Here is an example of the structured data. The example has been expanded for readability, but please save it so that each entry is minified onto a single line. The `aformatter.py` script can help you with this.

+```json5
+{

tools/aformatter.py

@@ -1,10 +1,12 @@
 #!/usr/bin/python

+from io import TextIOWrapper
+from typing import List
 import re
 import json
 import math
 import traceback
-from typing import List
+import tqdm

 END_NORMAL_IMAGE = "164"
 END_WHITEOUT_IMAGE = "166"

@@ -302,7 +304,6 @@ def floor_points(entry: dict):

     return entry

-
 def validate(entry: dict):
     """
     Validates the entry. Catches errors and reports warnings related to the entry.

@@ -339,16 +340,17 @@ def validate(entry: dict):
             print(f"{key} of entry {entry['id']} is still invalid! {entry[key]}")
     return return_status

-def per_line_entries(entries: list):
+def per_line_entries(entries: list, file: TextIOWrapper):
     """
-    Returns a string of all the entries, with every entry in one line.
+    Writes all the entries to the given file, with every entry on one line.
     """
-    out = "[\n"
-    for entry in entries:
-        if entry:
-            out += json.dumps(entry, ensure_ascii=False) + ",\n"
-    out = out[:-2] + "\n]"
-    return out
+    file.write("[\n")
+    line_temp = ""
+    for entry in tqdm.tqdm(entries):
+        if line_temp:
+            file.write(line_temp + ",\n")
+        line_temp = json.dumps(entry, ensure_ascii=False)
+    file.write(line_temp + "\n]")

 def format_all(entry: dict, silent=False):
     """

@@ -387,7 +389,7 @@ def format_all(entry: dict, silent=False):
     return entry

 def format_all_entries(entries):
-    for i in range(len(entries)):
+    for i in tqdm.trange(len(entries)):
         try:
             entry_formatted = format_all(entries[i], True)
             validation_status = validate(entries[i])

@@ -399,8 +401,6 @@ def format_all_entries(entries):
         except Exception:
             print(f"Exception occurred when formatting ID {entries[i]['id']}")
             print(traceback.format_exc())
-        if not (i % 200):
-            print(f"{i} checked.")

 def go(path):

@@ -411,10 +411,10 @@ def go(path):

     format_all_entries(entries)

-    print(f"{len(entries)} checked. Writing...")
+    print("Writing...")

     with open(path, "w", encoding='utf-8', newline='\n') as f2:
-        f2.write(per_line_entries(entries))
+        per_line_entries(entries, f2)

     print("Writing completed. All done.")
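
The rewritten `per_line_entries` streams each minified entry straight into the output file (with a `tqdm` progress bar) instead of accumulating one large string. A compact stand-in sketch of the format it produces, a JSON array with one minified entry per line, which is what CONTRIBUTING.md asks for in `web/atlas.json` (the sample entries are hypothetical):

```python
import json

entries = [
    {"id": "twtde8", "name": "Fattypillow"},
    {"id": "twtday", "name": "Mizutsune"},
]

# Compact equivalent of per_line_entries(entries, f): "[", one entry per line, "]".
with open("atlas-sample.json", "w", encoding="utf-8", newline="\n") as f:
    f.write("[\n")
    f.write(",\n".join(json.dumps(entry, ensure_ascii=False) for entry in entries))
    f.write("\n]")
```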

@@ -8,7 +8,7 @@ rm -rf .parcel-cache
 cp -r web/ dist-temp/

 npm i
-python tools/ci/cdn-to-local.py
+python tools/ci/cdn_to_local.py
 npx parcel build dist-temp/index.html dist-temp/**.html --dist-dir "dist" --no-source-maps --no-content-hash

 rm -rf dist-temp
tools/ci/requirements.txt (new file, 1 line)
@@ -0,0 +1 @@
+jsonschema

tools/ci/validate_json.py

@@ -2,13 +2,38 @@
 import sys
 import json
+from jsonschema import validate, RefResolver
+from pathlib import Path, PurePosixPath
+import os

-path = "./../../web/atlas.json"
+instance_path = "../../web/atlas.json"

 # path override as 1st param: validate_json.py path_to_file.json
 if (len(sys.argv) > 1):
-    path = sys.argv[1]
+    instance_path = sys.argv[1]

-json.load(open(path, "r", encoding='utf-8'))
+schema_path = "../schema/atlas.json"

+# schema override as 2nd param: validate_json.py [...] path_to_schema.json
+if (len(sys.argv) > 2):
+    schema_path = sys.argv[2]

+relative_path = "file:" + str(PurePosixPath(Path(os.getcwd(), schema_path)))

+schema = json.load(open(schema_path, "r", encoding='utf-8'))
+# exit()

+resolver = RefResolver(relative_path, schema)
+if os.path.isdir(instance_path):
+    for filename in os.listdir(instance_path):
+        f = os.path.join(instance_path, filename)
+        print(f)

+        instance = json.load(open(f, "r", encoding='utf-8'))
+        validate(instance, schema, resolver=resolver)
+elif os.path.isfile(instance_path):
+    print(instance_path)
+    instance = json.load(open(instance_path, "r", encoding='utf-8'))
+    validate(instance, schema, resolver=resolver)

 print("JSON is valid")
tools/create_patch.py (new file, 37 lines)
@@ -0,0 +1,37 @@
+import json
+import os
+import secrets
+from pathlib import Path
+
+patches_dir = "../data/patches/"
+Path(patches_dir).mkdir(parents=True, exist_ok=True)
+
+entry = None
+entry_input = ""
+
+print("Write/paste your JSON-formatted submission data here.")
+while entry is None:
+    entry_input += input("> ")
+    try:
+        entry = json.loads(entry_input)
+    except:
+        pass
+
+print()
+print("Submission is valid!")
+print()
+print("Enter your username as the attribution to be shown on the About page.")
+print("Leave it empty if you don't want to be attributed.")
+print("You can use your Reddit username. Do not include the \"u/\" part.")
+print("You can also use your GitHub username, but add \"gh:\" before your username (e.g. \"gh:octocat\")")
+author = input("Author: ")
+
+if author:
+    entry['_author'] = author
+
+filename = f'gh-{secrets.token_hex(2)}-{"-".join(entry["name"].split()).lower()}.json'
+with open(patches_dir + filename, 'w', encoding='utf-8') as out_file:
+    out_file.write(json.dumps(entry, ensure_ascii=False))
+
+print("Patch created as " + filename + "!")
+print("You can commit the created file directly, then push it and create a pull request.")
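
For illustration, a run of `create_patch.py` with a hypothetical entry named "My Entry" and the author "gh:octocat" would create a patch file along these lines (the four hex characters come from `secrets.token_hex(2)` and differ per run):

```python
import json
import secrets

# Hypothetical input, as entered at the script's prompts.
entry = {"id": 0, "name": "My Entry", "_author": "gh:octocat"}

# Filename pattern used by the script: gh-<4 hex chars>-<dashed-lowercase-name>.json
filename = f'gh-{secrets.token_hex(2)}-{"-".join(entry["name"].split()).lower()}.json'
print(filename)                               # e.g. gh-1a2b-my-entry.json
print(json.dumps(entry, ensure_ascii=False))  # the minified patch content
```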

tools/merge_out.py

@@ -2,90 +2,108 @@ import json
 import os
 from aformatter import format_all_entries, per_line_entries
-import scale_back
+import traceback

-from scale_back import ScaleConfig

-merge_source_file = 'temp-atlas.json'

-with open(merge_source_file, 'r', encoding='UTF-8') as f1:
-    out_json = json.loads(f1.read())

-format_all_entries(out_json)

-base_image_path = os.path.join('..', 'web', '_img', 'canvas', 'place30')
-ScaleConfig.image1 = os.path.join(base_image_path, '159.png')
-scale_back.swap_source_dest('164', '165', os.path.join(base_image_path, '163_159.png'))
-scale_back.scale_back_entries(out_json)
-scale_back.swap_source_dest('165', '166', os.path.join(base_image_path, '164_159.png'))
-scale_back.scale_back_entries(out_json)
-scale_back.swap_source_dest('166', '167', os.path.join(base_image_path, '165_159.png'))
-scale_back.scale_back_entries(out_json)

-out_ids = set()
-out_dupe_ids = set()
+out_ids = []
 atlas_ids = {}
+authors = []

+with open('../web/all-authors.txt', 'r', encoding='utf-8') as authors_file:
+    authors = authors_file.read().strip().split()

+with open('../data/read-ids.txt', 'r', encoding='utf-8') as ids_file:
+    out_ids = ids_file.read().strip().split()

 with open('../web/atlas.json', 'r', encoding='utf-8') as atlas_file:
-    atlas_json = json.loads(atlas_file.read())
+    atlas_data = json.loads(atlas_file.read())

-for i, entry in enumerate(atlas_json):
+# format_all_entries(atlas_data)

+# base_image_path = os.path.join('..', 'web', '_img', 'canvas', 'place30')
+# ScaleConfig.image1 = os.path.join(base_image_path, '159.png')
+# scale_back.swap_source_dest('164', '165', os.path.join(base_image_path, '163_159.png'))
+# scale_back.scale_back_entries(atlas_data)
+# scale_back.swap_source_dest('165', '166', os.path.join(base_image_path, '164_159.png'))
+# scale_back.scale_back_entries(atlas_data)
+# scale_back.swap_source_dest('166', '167', os.path.join(base_image_path, '165_159.png'))
+# scale_back.scale_back_entries(atlas_data)

+last_id = 0

+for i, entry in enumerate(atlas_data):
     atlas_ids[entry['id']] = i
+    id = entry['id']
+    if id.isnumeric() and int(id) > last_id and int(id) - last_id < 100:
+        last_id = int(id)

-last_existing_id = list(atlas_json[-1]['id'])
+patches_dir = "../data/patches/"
+if not os.path.exists(patches_dir):
+    print("Patches folder not found. Exiting.")
+    exit()

+for filename in os.listdir(patches_dir):
+    f = os.path.join(patches_dir, filename)

-for entry in out_json:
-    if entry['id'] == 0 or entry['id'] == '0':
-        # "Increment" the last ID to derive a new ID.
-        current_index = -1
-        while current_index > -(len(last_existing_id)):
-            current_char = last_existing_id[current_index]
-            if current_char == 'z':
-                last_existing_id[current_index] = '0'
-                current_index -= 1
-            else:
-                if current_char == '9':
-                    current_char = 'a'
-                else:
-                    current_char = chr(ord(current_char) + 1)
-                last_existing_id[current_index] = current_char
-                break
-        entry['id'] = ''.join(last_existing_id)

-for entry in out_json:
-    if entry['id'] in out_ids:
-        print(f"Entry {entry['id']} has duplicates! Please resolve this conflict. This will be excluded from the merge.")
-        out_dupe_ids.add(entry['id'])
-    out_ids.add(entry['id'])

-for entry in out_json:
-    if entry['id'] in out_dupe_ids:
+    if not os.path.isfile(f) or not f.endswith('json'):
         continue

-    if 'edit' in entry and entry['edit']:
-        assert entry['id'] in atlas_ids, "Edit failed! ID not found on Atlas."
-        index = atlas_ids[entry['id']]
+    print(f"{filename}: Processing...")

+    try:
+        with open(f, 'r', encoding='utf-8') as entry_file:
+            entry = json.loads(entry_file.read())

-        assert index != None, "Edit failed! ID not found on Atlas."
+        if '_reddit_id' in entry:
+            reddit_id = entry['_reddit_id']
+            if reddit_id in out_ids:
+                print(f"{filename}: Submission from {entry['id']} has been included! This will be ignored from the merge.")
+                continue
+            out_ids.append(reddit_id)
+            del entry['_reddit_id']

-        print(f"Edited {atlas_json[index]['id']} with {entry['edit']}")
+        # This wouldn't work if it is an edit.
+        # If needed, we can add a type to the patch to be more foolproof.
+        # if entry['id'] in out_ids:
+        #     print(f"{filename}: Submission from {entry['id']} has been included! This will be ignored from the merge.")
+        #     continue

-        del entry['edit']
-        atlas_json[index] = entry
-    elif entry['id'] in atlas_ids:
-        print(f"Edited {entry['id']} manually.")
-        atlas_json[atlas_ids[entry['id']]] = entry
-    else:
-        print(f"Added {entry['id']}.")
-        atlas_json.append(entry)
+        if '_author' in entry:
+            author = entry['_author']
+            if author not in authors:
+                authors.append(author)
+            del entry['_author']

+        if isinstance(entry['id'], int) and entry['id'] < 1:
+            last_id += 1
+            print(f"{filename}: Entry is new, assigned ID {last_id}")
+            entry['id'] = str(last_id)
+        elif entry['id'] not in out_ids:
+            out_ids.append(entry['id'])

+        if entry['id'] in atlas_ids:
+            index = atlas_ids[entry['id']]
+            print(f"{filename}: Edited {atlas_data[index]['id']}.")
+            atlas_data[index] = entry
+        else:
+            print(f"{filename}: Added {entry['id']}.")
+            atlas_data.append(entry)

+        os.remove(f)

+    except:
+        print(f"{filename}: Something went wrong; patch couldn't be implemented. Skipping.")
+        traceback.print_exc()

 print('Writing...')
 with open('../web/atlas.json', 'w', encoding='utf-8') as atlas_file:
-    atlas_file.write(per_line_entries(atlas_json))
+    per_line_entries(atlas_data, atlas_file)

-with open('../data/read-ids.txt', 'a', encoding='utf-8') as read_ids_file:
-    with open('temp-read-ids.txt', 'r+', encoding='utf-8') as read_ids_temp_file:
-        read_ids_file.writelines(read_ids_temp_file.readlines())
-        read_ids_temp_file.truncate(0)
+with open('../data/read-ids.txt', 'w', encoding='utf-8') as ids_file:
+    ids_file.write("\n".join(out_ids) + "\n")

+with open('../web/all-authors.txt', 'w', encoding='utf-8') as authors_file:
+    authors_file.write("\n".join(authors) + "\n")

 print('All done.')
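
To summarize the new flow: `merge_out.py` now consumes the patch files that `create_patch.py` and the Reddit crawler drop into `data/patches`, strips the `_reddit_id` and `_author` bookkeeping keys, and assigns fresh IDs to new entries. A condensed sketch of that per-patch handling, with hypothetical values:

```python
import json

# A hypothetical patch file's content, as produced by the Reddit crawler.
patch = {
    "id": 0,                  # placeholder ID; the merge assigns the next free one
    "name": "Example artwork",
    "_reddit_id": "u8xyz1",   # recorded in data/read-ids.txt, then removed
    "_author": "gh:octocat",  # appended to web/all-authors.txt, then removed
}

out_ids, authors, last_id = [], [], 2000  # stand-ins for the script's state

if "_reddit_id" in patch:
    out_ids.append(patch.pop("_reddit_id"))
if "_author" in patch:
    author = patch.pop("_author")
    if author not in authors:
        authors.append(author)

if isinstance(patch["id"], int) and patch["id"] < 1:
    last_id += 1
    patch["id"] = str(last_id)  # new entries get "2001", "2002", ...

print(json.dumps(patch, ensure_ascii=False))
```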

@@ -7,8 +7,10 @@ Migrator script from old atlas format to remastered atlas format.
 - submitted_by removed
 """

+from io import TextIOWrapper
 import re
 import json
+import tqdm

 END_IMAGE = 166
 INIT_CANVAS_RANGE = (1, END_IMAGE)

@@ -73,16 +75,17 @@ def migrate_atlas_format(entry: dict):

     return toreturn

-def per_line_entries(entries: list):
+def per_line_entries(entries: list, file: TextIOWrapper):
     """
-    Returns a string of all the entries, with every entry in one line.
+    Writes all the entries to the given file, with every entry on one line.
     """
-    out = "[\n"
-    for entry in entries:
-        if entry:
-            out += json.dumps(entry, ensure_ascii=False) + ",\n"
-    out = out[:-2] + "\n]"
-    return out
+    file.write("[\n")
+    line_temp = ""
+    for entry in tqdm.tqdm(entries):
+        if line_temp:
+            file.write(line_temp + ",\n")
+        line_temp = json.dumps(entry, ensure_ascii=False)
+    file.write(line_temp + "\n]")

 if __name__ == '__main__':

@@ -93,16 +96,14 @@ if __name__ == '__main__':
     with open(path, "r+", encoding='UTF-8') as f1:
         entries = json.loads(f1.read())

-    for i in range(len(entries)):
+    for i in tqdm.trange(len(entries)):
         entry_formatted = migrate_atlas_format(entries[i])
         entries[i] = entry_formatted
-        if not (i % 1000):
-            print(f"{i} checked.")

     print(f"{len(entries)} checked. Writing...")

     with open(path, "w", encoding='utf-8', newline='\n') as f2:
-        f2.write(per_line_entries(entries))
+        per_line_entries(entries, f2)

     print("Writing completed. All done.")

@@ -17,74 +17,90 @@ Running:
 1. Run the script
 2. Input the next ID to use
 3. Manually resolve errors in temp-atlas-manual.json
-4 a. Use merge_out.py, or...
-  b. a. Copy temp-atlas.json entries into web/_js/atlas.js (mind the edits!)
-     b. Copy temp-read-ids.txt IDs into data/read-ids.txt
+4. a. Use merge_out.py, or...
+   b. a. Copy temp-atlas.json entries into web/_js/atlas.js (mind the edits!)
+      b. Copy temp-read-ids.txt IDs into data/read-ids.txt
+5. Create a pull request
 """

-import praw
+from praw import Reddit
+from praw.models import Submission
 import json
 import time
 import re
 import traceback
 from aformatter import format_all, validate
+from pathlib import Path
+import humanize
+from datetime import datetime
+import secrets

-with open('temp-atlas.json', 'w', encoding='utf-8') as OUT_FILE, open('temp-read-ids.txt', 'w') as READ_IDS_FILE, open('temp-atlas-manual.txt', 'w', encoding='utf-8') as FAIL_FILE:
+patches_dir = "../data/patches/"
+Path(patches_dir).mkdir(parents=True, exist_ok=True)

-    OUT_FILE_LINES = ['[\n', ']\n']
+def set_flair(submission, flair):
+    if has_write_access and submission.link_flair_text != flair:
+        flair_choices = submission.flair.choices()
+        flair = next(x for x in flair_choices if x["flair_text_editable"] and flair == x["flair_text"])
+        submission.flair.select(flair["flair_template_id"])

-    with open('credentials', 'r') as file:
-        credentials = file.readlines()
-        client_id = credentials[0].strip()
-        client_secret = credentials[1].strip()
-        username = credentials[2].strip() if len(credentials) > 3 else ""
-        password = credentials[3].strip() if len(credentials) > 3 else ""
+with open('credentials', 'r') as file:
+    credentials = file.readlines()
+    client_id = credentials[0].strip()
+    client_secret = credentials[1].strip()
+    username = credentials[2].strip() if len(credentials) > 3 else ""
+    password = credentials[3].strip() if len(credentials) > 3 else ""

-    reddit = praw.Reddit(
-        client_id=client_id,
-        client_secret=client_secret,
-        username=username,
-        password=password,
-        user_agent='atlas_bot'
-    )
+reddit = Reddit(
+    client_id=client_id,
+    client_secret=client_secret,
+    username=username,
+    password=password,
+    user_agent='atlas_bot'
+)

-    has_write_access = not reddit.read_only
-    if not has_write_access:
-        print("Warning: No write access. Post flairs will not be updated.")
-        time.sleep(5)
+has_write_access = not reddit.read_only
+if not has_write_access:
+    print("Warning: No write access. Post flairs will not be updated. Waiting 5 seconds...")
+    # time.sleep(5)

-    existing_ids = []
+print("Running...")

-    with open('../data/read-ids.txt', 'r') as edit_ids_file:
-        for id in [x.strip() for x in edit_ids_file.readlines()]:
-            existing_ids.append(id)
+existing_ids = []

-    def set_flair(submission, flair):
-        if has_write_access and submission.link_flair_text != flair:
-            flair_choices = submission.flair.choices()
-            flair = next(x for x in flair_choices if x["flair_text_editable"] and flair == x["flair_text"])
-            submission.flair.select(flair["flair_template_id"])
+with open('../data/read-ids.txt', 'r') as edit_ids_file:
+    for id in [x.strip() for x in edit_ids_file.readlines()]:
+        existing_ids.append(id)

-    total_all_flairs = 0
-    duplicate_count = 0
-    failcount = 0
-    successcount = 0
-    totalcount = 0
+total_all_flairs = 0
+count_dupe = 0
+count_fail = 0
+count_success = 0
+count_total = 0

-    for submission in reddit.subreddit('placeAtlas2').new(limit=2000):
+with open('temp-atlas-manual.txt', 'w', encoding='utf-8') as FAIL_FILE:

+    submission: Submission
+    for submission in reddit.subreddit('placeAtlas2').new(limit=1000):
         total_all_flairs += 1

-        if (submission.id in existing_ids):
-            set_flair(submission, "Processed Entry")
-            print("Found first duplicate!")
-            duplicate_count += 1
-            if (duplicate_count > 0):
-                break
-            else:
-                continue
+        print(f"{submission.id}: Submitted {humanize.naturaltime(datetime.utcnow() - datetime.utcfromtimestamp(submission.created_utc))}.")

-        if submission.link_flair_text == "New Entry" or submission.link_flair_text == "Edit Entry":
+        # print(patches_dir + 'reddit-' + submission.id + '.json')
+        if submission.id in existing_ids or Path(patches_dir + 'reddit-' + submission.id + '.json').is_file():
+            set_flair(submission, "Processed Entry")
+            print(f"{submission.id}: Submission is a duplicate! Skipped.")
+            if (count_dupe == 1):
+                print(f"{submission.id}: Second duplicate. Stopped!")
+                break
+            print(f"{submission.id}: First duplicate. Continue running.")
+            count_dupe += 1
+            continue

+        print(f"{submission.id}: Processing...")

+        if submission.link_flair_text == "New Entry" or submission.link_flair_text == "Edit Entry" or True:

             try:

@@ -102,16 +118,11 @@
                 if submission_json:

                     if submission.link_flair_text == "Edit Entry":
-                        assert submission_json["id"] != 0, "Edit invalid because ID is tampered, it must not be 0!"
-                        submission_json_dummy = {"id": submission_json["id"], "edit": submission.id}
+                        assert submission_json["id"] > 0, "Edit invalid because ID is tampered, it must not be 0 or -1!"
                     else:
-                        assert submission_json["id"] == 0, "Edit invalid because ID is tampered, it must be 0!"
-                        submission_json_dummy = {"id": submission.id}
+                        assert submission_json["id"] <= 0, "Addition invalid because ID is tampered, it must be 0 or -1!"

+                    submission_json_dummy = {"id": submission_json["id"], "_reddit_id": submission.id, "_author": submission.author.name}

                     for key in submission_json:
                         if not key in submission_json_dummy:

@@ -121,13 +132,11 @@
                     assert validation_status < 3, \
                         "Submission invalid after validation. This may be caused by not enough points on the path."

+                    with open(f'{patches_dir}reddit-{submission.id}-{"-".join(submission_json["name"].split()).lower()}.json', 'w', encoding='utf-8') as out_file:
+                        out_file.write(json.dumps(submission_json, ensure_ascii=False))

-                    add_comma_line = len(OUT_FILE_LINES) - 2
-                    if len(OUT_FILE_LINES[add_comma_line]) > 2:
-                        OUT_FILE_LINES[add_comma_line] = OUT_FILE_LINES[add_comma_line].replace('\n', ',\n')
-                    OUT_FILE_LINES.insert(len(OUT_FILE_LINES) - 1, json.dumps(submission_json, ensure_ascii=False) + '\n')
-                    READ_IDS_FILE.write(submission.id + '\n')
-                    successcount += 1
+                    count_success += 1
                     set_flair(submission, "Processed Entry")

                 except Exception as e:

@@ -140,12 +149,11 @@
                         "==== CLEAN ====" + "\n\n" +
                         text + "\n\n"
                     )
-                    failcount += 1
+                    count_fail += 1
                     set_flair(submission, "Rejected Entry")
+                    print(f"{submission.id}: Something went wrong! Rejected.")

-                print("Wrote " + submission.id + ", submitted " + str(round(time.time()-submission.created_utc)) + " seconds ago")
-                totalcount += 1
+                count_total += 1
+                print(f"{submission.id}: Processed!")

-    OUT_FILE.writelines(OUT_FILE_LINES)

-print(f"\n\nTotal all flairs: {total_all_flairs}\nSuccess: {successcount}/{totalcount}\nFail: {failcount}/{totalcount}\nPlease check temp-atlas-manual.txt for failed entries to manually resolve.")
+print(f"\n\nTotal all flairs: {total_all_flairs}\nSuccess: {count_success}/{count_total}\nFail: {count_fail}/{count_total}\nPlease check temp-atlas-manual.txt for failed entries to manually resolve.")

@@ -1 +1,3 @@
 praw
+tqdm
+humanize

tools/scale_back.py

@@ -1,10 +1,12 @@
 #!/usr/bin/python

+from io import TextIOWrapper
 import json
 import traceback
 import numpy
 from PIL import Image, ImageDraw
 import gc
+import tqdm

 """
 # 166 to 164 with reference of 165

@@ -147,16 +149,17 @@ def remove_white(entry: dict):

     return entry

-def per_line_entries(entries: list):
+def per_line_entries(entries: list, file: TextIOWrapper):
     """
-    Returns a string of all the entries, with every entry in one line.
+    Writes all the entries to the given file, with every entry on one line.
     """
-    out = "[\n"
-    for entry in entries:
-        if entry:
-            out += json.dumps(entry, ensure_ascii=False) + ",\n"
-    out = out[:-2] + "\n]"
-    return out
+    file.write("[\n")
+    line_temp = ""
+    for entry in tqdm.tqdm(entries):
+        if line_temp:
+            file.write(line_temp + ",\n")
+        line_temp = json.dumps(entry, ensure_ascii=False)
+    file.write(line_temp + "\n]")

 def format_all(entry: dict, silent=False):
     def print_(*args, **kwargs):

@@ -168,7 +171,7 @@ def format_all(entry: dict, silent=False):
     return entry

 def scale_back_entries(entries):
-    for i in range(len(entries)):
+    for i in tqdm.trange(len(entries)):
         try:
             entry_formatted = format_all(entries[i], True)
             entries[i] = entry_formatted

@@ -191,7 +194,7 @@ def go(path):
     print(f"{len(entries)} checked. Writing...")

     with open(path, "w", encoding='utf-8', newline='\n') as f2:
-        f2.write(per_line_entries(entries))
+        per_line_entries(entries, f2)

     print("Writing completed. All done.")
tools/schema/atlas.json (new file, 130 lines)
@@ -0,0 +1,130 @@
+{
+  "$schema": "https://json-schema.org/draft-07/schema",
+  "type": "array",
+  "definitions": {
+    "entry": {
+      "type": "object",
+      "properties": {
+        "id": {
+          "oneOf": [
+            {
+              "type": "string"
+            },
+            {
+              "type": "integer",
+              "minimum": 1
+            },
+            {
+              "type": "integer",
+              "minimum": -1,
+              "maximum": 0,
+              "description": "The ID of the entry. The value is a placeholder for new entries."
+            }
+          ],
+          "description": "The ID of the entry. Usually this is the Reddit post ID of the new entry submission, as a string or a number."
+        },
+        "name": {
+          "type": "string",
+          "description": "The short, descriptive name of the entry.",
+          "minLength": 1
+        },
+        "description": {
+          "type": "string",
+          "description": "The description of the entry, written so that it will also be understood by somebody not familiar with the topic. Usually, the first sentence on Wikipedia is a good example."
+        },
+        "links": {
+          "type": "object",
+          "description": "The links related to the entry.",
+          "properties": {
+            "subreddit": {
+              "type": "array",
+              "description": "Subreddits that are either most relevant to the topic or that were responsible for creating the artwork, excluding the r/.",
+              "minItems": 1,
+              "items": {
+                "type": "string",
+                "description": "A subreddit that's either most relevant to the topic, or that was responsible for creating the artwork.",
+                "pattern": "^[A-Za-z0-9][A-Za-z0-9_]{1,20}$"
+              }
+            },
+            "website": {
+              "type": "array",
+              "description": "URLs of websites related to the entry, including the http/https protocol. If you're describing a project, the project's main website would be suitable here.",
+              "minItems": 1,
+              "items": {
+                "type": "string",
+                "description": "The URL of a website related to the entry.",
+                "pattern": "^https?://[^\\s/$.?#].[^\\s]*$"
+              }
+            },
+            "discord": {
+              "type": "array",
+              "description": "Invite codes of Discord servers related to the entry (excluding discord.gg/).",
+              "minItems": 1,
+              "items": {
+                "type": "string",
+                "description": "The invite code of a Discord server related to the entry.",
+                "minLength": 1
+              }
+            },
+            "wiki": {
+              "type": "array",
+              "description": "Wiki pages related to the entry.",
+              "minItems": 1,
+              "items": {
+                "type": "string",
+                "description": "The title of the wiki page related to the entry.",
+                "minLength": 1
+              }
+            }
+          },
+          "additionalProperties": false
+        },
+        "path": {
+          "type": "object",
+          "description": "The path of the entry.",
+          "patternProperties": {
+            "^(\\d+(-\\d+)?|\\w+(:\\d+(-\\d+)?)?)(, (\\d+(-\\d+)?|\\w+(:\\d+(-\\d+)?)?))*$": {
+              "type": "array",
+              "description": "A period containing the path points.",
+              "minItems": 3,
+              "items": {
+                "type": "array",
+                "description": "A point.",
+                "minItems": 2,
+                "maxItems": 2,
+                "items": {
+                  "type": "number"
+                }
+              }
+            }
+          },
+          "additionalProperties": false,
+          "minProperties": 1
+        },
+        "center": {
+          "type": "object",
+          "description": "The center of the entry.",
+          "patternProperties": {
+            "^(\\d+(-\\d+)?|\\w+(:\\d+(-\\d+)?)?)(, (\\d+(-\\d+)?|\\w+(:\\d+(-\\d+)?)?))*$": {
+              "type": "array",
+              "description": "A period containing the center point.",
+              "minItems": 2,
+              "maxItems": 2,
+              "items": {
+                "type": "number",
+                "description": "A point."
+              }
+            }
+          },
+          "additionalProperties": false,
+          "minProperties": 1
+        }
+      },
+      "required": ["id", "name", "description", "links", "path", "center"],
+      "additionalItems": true
+    }
+  },
+  "items": {
+    "$ref": "#/definitions/entry"
+  }
+}
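
To make the schema concrete, here is a sketch that validates one entry shaped like the real data in `web/atlas.json` (values taken from an existing entry; `jsonschema` assumed installed, run from the repository root). Note the period keys such as `"109-165, T:0-1"` matched by the `patternProperties` regex, and that the root type is an array:

```python
import json
from jsonschema import Draft7Validator

with open("tools/schema/atlas.json", "r", encoding="utf-8") as f:
    schema = json.load(f)

entry = {
    "id": "twtde8",  # a Reddit post ID; 0 or -1 is the placeholder for new entries
    "name": "Fattypillow",
    "description": "Popular czech YouTuber and streamer!",
    "links": {"subreddit": ["Fattypillow"]},
    "path": {"109-165, T:0-1": [[1536, 1233], [1599, 1233], [1599, 1249], [1536, 1249]]},
    "center": {"109-165, T:0-1": [1568, 1241]},
}

Draft7Validator(schema).validate([entry])  # raises ValidationError on failure
print("Entry is valid")
```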
tools/schema/patch.json (new file, 16 lines)
@@ -0,0 +1,16 @@
+{
+  "$schema": "https://json-schema.org/draft-07/schema",
+  "$ref": "atlas.json#/definitions/entry",
+  "properties": {
+    "_author": {
+      "type": "string",
+      "description": "Patch only: Author of the entry.",
+      "minLength": 1
+    },
+    "_reddit_id": {
+      "type": "string",
+      "description": "Patch only: Submission ID, if submitted from Reddit.",
+      "minLength": 1
+    }
+  }
+}

@@ -5,17 +5,35 @@
  * Licensed under AGPL-3.0 (https://place-atlas.stefanocoding.me/license.txt)
  */

-const redditWrapperEl = document.querySelector('#reddit-contributors-wrapper')
+const contributorsEl = document.querySelector('#contributors-wrapper')

+// <i aria-label="GitHub" class="bi bi-github"></i>
+const gitHubEl = document.createElement("i")
+gitHubEl.ariaLabel = "GitHub:"
+gitHubEl.className = "bi bi-github"

 fetch('all-authors.txt')
 	.then(response => response.text())
-	.then(text => text.trim().split('\n').sort())
+	.then(text => text.trim().split('\n').sort((a, b) => {
+		const aSplit = a.split(':')
+		const bSplit = b.split(':')
+		return aSplit[aSplit.length - 1].localeCompare(bSplit[bSplit.length - 1])
+	}))
 	.then(contributors => {
-		document.querySelector('#reddit-contributors-count').textContent = contributors.length
+		document.querySelector('#contributors-count').textContent = contributors.length
 		for (const contributor of contributors) {
 			const userEl = document.createElement('a')
-			userEl.href = 'https://reddit.com/user/' + contributor
-			userEl.textContent = contributor
-			redditWrapperEl.appendChild(userEl)
-			redditWrapperEl.appendChild(document.createTextNode(' '))
+			const contributorSplit = contributor.split(':')
+			if (contributorSplit[0] === "gh") {
+				const contributorName = contributorSplit[1]
+				userEl.href = 'https://github.com/' + contributorName
+				userEl.appendChild(gitHubEl.cloneNode())
+				userEl.appendChild(document.createTextNode(' ' + contributorName))
+			} else {
+				userEl.href = 'https://reddit.com/user/' + contributor
+				userEl.textContent = contributor
+			}
+			contributorsEl.appendChild(userEl)
+			contributorsEl.appendChild(document.createTextNode(' '))
 		}
 	})

@@ -260,7 +260,7 @@ function initDraw() {

 	function generateExportObject() {
 		const exportObject = {
-			id: entryId ?? 0,
+			id: entryId ?? -1,
 			name: nameField.value,
 			description: descriptionField.value,
 			links: {},

@@ -500,7 +500,9 @@ async function init() {
 }

 function updateAtlasAll(atlas = atlasAll) {
-	for (const entry of atlas) {
+	for (const index in atlas) {
+		const entry = atlas[index]
+		entry._index = index
 		const currentLinks = entry.links
 		entry.links = {
 			website: [],

@@ -106,18 +106,20 @@ async function updateBackground(newPeriod = currentPeriod, newVariation = currentVariation) {
 	}
 	const canvas = document.createElement('canvas')
 	const context = canvas.getContext('2d')
-	for await (const url of layerUrls) {
+	layers.length = layerUrls.length
+	await Promise.all(layerUrls.map(async (url, i) => {
 		const imageLayer = new Image()
 		await new Promise(resolve => {
 			imageLayer.onload = () => {
 				context.canvas.width = Math.max(imageLayer.width, context.canvas.width)
 				context.canvas.height = Math.max(imageLayer.height, context.canvas.height)
-				layers.push(imageLayer)
+				layers[i] = imageLayer
 				resolve()
 			}
 			imageLayer.src = url
 		})
-	}
+	}))

 	for (const imageLayer of layers) {
 		context.drawImage(imageLayer, 0, 0)

@@ -370,10 +370,10 @@ function buildObjectsList(filter, sort = defaultSort) {
 			sortFunction = (a, b) => b.name.toLowerCase().localeCompare(a.name.toLowerCase())
 			break
 		case "newest":
-			sortFunction = (a, b) => b.id.length - a.id.length || b.id.localeCompare(a.id)
+			sortFunction = (a, b) => b._index - a._index
 			break
 		case "oldest":
-			sortFunction = (a, b) => a.id.length - b.id.length || a.id.localeCompare(b.id)
+			sortFunction = (a, b) => a._index - b._index
 			break
 		case "area":
 			sortFunction = (a, b) => calcPolygonArea(b.path) - calcPolygonArea(a.path)

@@ -180,10 +180,10 @@
 				</ul>
 			</li>
 		</ul>
-		<h3>Reddit Contributors</h3>
-		<p>The 2022 Atlas would not have been possible without the help of our <span id="reddit-contributors-count"></span> Reddit contributors.</p>
+		<h3>Contributors</h3>
+		<p>The 2022 Atlas would not have been possible without the help of our <span id="contributors-count"></span> contributors.</p>
 		<p>Thank you to everyone who submitted new entries, amended existing ones, reported bugs and just supported the project in general.</p>
-		<div id="reddit-contributors-wrapper" style="text-align: justify;"></div>
+		<div id="contributors-wrapper" style="text-align: justify;"></div>
 	</div>
 </div>
 <div class="col-md-5 col-xl-4">

web/all-authors.txt

@@ -5075,7 +5075,6 @@ JohnnyHotshot
-robotic
olly
Shadox
Ericbazinga
MingCate
SlipsSC_
carlyc999

@@ -5128,7 +5127,7 @@ p1terdeN
IncestSimulator2016
zephyr12345
Blizhazard
Fishes_Glubs & GamerKingFaiz
GamerKingFaiz
Wodgam
TheNomad
VinsElBins

@@ -5160,7 +5159,6 @@ neurospex
soopimus_
SporekidX
ForsenPlace
scorpion24100 / ThePizzaMuncher
Vapku
BouchonEnPlastique
SailorElei

@@ -5298,7 +5296,6 @@ HappyMerlin
YummyGummyDrops
Forcoy
RookeMistake
slanterns
raudrin
AriaNoire
evaroussel

@@ -5407,4 +5404,158 @@ Hellmustang0226
tiny2ooons
duroki66
Aloxite
Polygonboy0
Loic78570
TheSuperR5
Obi7199
Sollydeu
NoNeedleworker531
SadPinguu
NoelleTGS
christmasmanexists
TheDrew-23
BurakOdm
Nobodytheinvisible
poundmycake
PewdiepieFanBoi69420
yaseensherif_
KingKurto_
Xtheman1674
Frolainheu
Grahnolaxwastaken
GwendolynGravers
Mundane_Board_3277
include_username_h
LuffytheRocky
profemain
Budgerigar17
guyguy46383758
Kapt0
SkayaTheKarp
The-Drumstick-Empire
Downtown-Stand1109
No_Ad3819
Hans5958_
TheKingOfKings75
Randomcoolvids_YT
DOMOHEAD
Macaroni_TheSecond
TED2622
Typical_Attention105
Afraid_Success_4836
Choice_Ad_9562
NjordLum
MarvelsSpooderMan
Gagas33
Mistery_0
Nacil_54
moreorlesser
TheRedstoneRazor
Intelligent_Image975
ThatYugoGuy
m654zy
imskyyc
Eaglewolf13
Spaceman333_exe
FishingGuppy
cyingbabyy
ESoreos
Veroune_
Senior_Broccoli5485
MisssSheep
Licensed2Chill
THE_BATTEUR
Furry_Memelord
usernameista
ventureDIIIIED
beepumbra
bubuarana
Last-Adhesiveness-84
srurskem
itsnotgood1337
PrteaRhea
Linkinito
surelychoo
Dizzy-Office-307
theswannwholaughs
Leotzuafk
DavidCZ200510
Mathlenormand
hydre39
Hajimes_acid
FNAFB_true_fan
Canada_LaVearn
Break_Emotional
LukenTosa
Rydoggo5392
Lait--Fraise
Fishcracks13
ilyessboui
Ronkad
OrmanRedwood
jamontamo
Pikafreak108
Pugo0
Suspicious_Price2037
Mystichunterz
recitedStrawfox
lsoroc
Lioli_
Key-Control8107
How-did-we-get-here3
r0xANDt0l
sqrtney
mr_terms_and_perms
Greyflex
Chandler8105
Raider440
zonkerberg
Strayox
Fincunder
Rexzilarate-2
kuane2000
f0rmidablez
PhireKappa
SolkaP7
Left-Ambition-5127
Nihekan368
parkas_
hydrielax
Sfa11305
Yeet_Away_The_Pain
Inevitable_Sail_826
WtvrBro
Evaberer
SunnyM0on
Teblefer
nuwenlee
heevanington
OJack18
TheRealDunko
Podongos
Muff3ntop
Spyne34
Enyrox
SkalxV
Consistent_Squirrel
Living_Psychology108
TapleStape
Eldipypapuh
TrollusRT
skitou
KingSammelot
Adventurous-Rock5765
AldoSpacewool
tipoima
TempleTerry
IntelligentHat2308
Hatsuku39
johnthesoap
ktwombley
SomeFrenchFurry
elijahthetrashman
GamesTheOracle
waddleguin
GDJosef
eri531
-Yox-

web/atlas.json

@@ -1495,7 +1495,7 @@
{"id": "twtde8", "name": "Fattypillow", "description": "Popular czech YouTuber and streamer!", "links": {"subreddit": ["Fattypillow"]}, "path": {"109-165, T:0-1": [[1536, 1233], [1599, 1233], [1599, 1249], [1536, 1249]]}, "center": {"109-165, T:0-1": [1568, 1241]}},
{"id": "twtday", "name": "Mizutsune", "description": "This is the icon of Mizutsune, a popular monster from the Monster Hunter series. It has appeared in several Monster Hunter games and is beloved by its fans for its colourful apperance and the ability to produce bubbles, which led to the nickname \"bubble fox\". In the top-right corner, The initials of the game can be seen. They are written in the color scheme of the France flag, in memoriam of the small neighboring France flag, which was destroyed towards the end. This artwork was allied with the Destiny artwork above, The vegan banner to the left and with the BazzaGazza logo on the right. The whole story can be read here: https://www.reddit.com/r/monster_hunter_place/comments/twx9yq/looking_back_on_rplace_2022_the_full_timeline/", "links": {"website": ["https://monsterhunter.fandom.com/wiki/Mizutsune"], "subreddit": ["MonsterHunter"]}, "path": {"109-166, T:0-1": [[1908, 1562], [1908, 1600], [1874, 1600], [1875, 1562]]}, "center": {"109-166, T:0-1": [1891, 1581]}},
{"id": "twtd6i", "name": "Star Academy (reste du logo)", "description": "Les restes du logo Star Academy (rip petitanj) créé le 04/04/2022 par les doux dingues sur la chaîne Flonflon_musique", "links": {"website": ["https://www.twitch.tv/flonflon_musique"]}, "path": {"109-165, T:0-1": [[1040, 1315], [1038, 1331], [1053, 1332], [1055, 1302]]}, "center": {"109-165, T:0-1": [1047, 1320]}},
{"id": "twtd4j", "name": "Shiny Chatot", "description": "Chatot is a Normal/Flying-type parrot Pokémon from the Generation 4 Pokémon games (Pokémon Diamond/Pearl). This Chatot has a rare Shiny color, with pink wings instead of the usual blue.\n\nThis art was drawn by Chatot Dungeon, a spinoff group from the Twitch channel Twitch Plays Pokémon that formed in 2014 after getting timed out in chat for saying \"I like Chatot\". The Chatot was later turned Shiny by French Pokémon YouTuber (PokéTuber) Sneaze, whose mascot is a Shiny Chatot named Mastouffe. ", "links": {"website": ["https://www.youtube.com/channel/UCQjurXV2DUU1LU2FiSWamIg", "https://bulbapedia.bulbagarden.net/wiki/Chatot_(Pok%C3%A9mon)"], "subreddit": ["pokemon"]}, "path": {"3-15": [[80, 742], [76, 744], [74, 746], [73, 749], [73, 752], [75, 754], [78, 754], [78, 755], [80, 757], [87, 758], [87, 755], [90, 752], [90, 749], [92, 749], [92, 744], [89, 744], [89, 742]], "16-166, T:0-1": [[82, 743], [80, 745], [80, 748], [76, 750], [74, 752], [74, 754], [73, 755], [73, 758], [75, 760], [78, 760], [78, 761], [80, 763], [85, 764], [87, 762], [87, 761], [90, 758], [90, 755], [92, 755], [92, 750], [89, 750], [89, 748], [86, 745]]}, "center": {"3-15": [83, 749], "16-166, T:0-1": [83, 755]}},
{"id": "twtd4j", "name": "Shiny Chatot", "description": "Chatot is a Normal/Flying-type parrot Pokémon from the Generation 4 Pokémon games (Pokémon Diamond/Pearl). This Chatot has a rare Shiny color, with pink wings instead of the usual blue.\n\nThis art was drawn by Chatot Dungeon, a spinoff group from the Twitch channel Twitch Plays Pokémon that formed in 2014 after getting timed out in chat for saying \"I like Chatot\". The Chatot was later turned Shiny by French Pokémon YouTuber (PokéTuber) Sneaze, whose mascot is a Shiny Chatot named Mastouffe.", "links": {"website": ["https://www.youtube.com/channel/UCQjurXV2DUU1LU2FiSWamIg", "https://bulbapedia.bulbagarden.net/wiki/Chatot_(Pok%C3%A9mon)"], "subreddit": ["pokemon"]}, "path": {"3-15": [[80, 742], [76, 744], [74, 746], [73, 749], [73, 752], [75, 754], [78, 754], [78, 755], [80, 757], [87, 758], [87, 755], [90, 752], [90, 749], [92, 749], [92, 744], [89, 744], [89, 742]], "16-166, T:0-1": [[82, 743], [80, 745], [80, 748], [76, 750], [74, 752], [74, 754], [73, 755], [73, 758], [75, 760], [78, 760], [78, 761], [80, 763], [85, 764], [87, 762], [87, 761], [90, 758], [90, 755], [92, 755], [92, 750], [89, 750], [89, 748], [86, 745]]}, "center": {"3-15": [83, 749], "16-166, T:0-1": [83, 755]}},
{"id": "twtd3k", "name": "Girls' Frontline", "description": "Girls' Frontline is a turn-based strategy gacha game for mobile, developed by China-based studio MICA Team. In the game, players control androids known as \"T-Dolls\" who all carry versions of real-life firearms. The logo is the acronym of the game with a silhouette of the game's protagonist, M4A1. To the lower right of the logo is a pixelated version of HK416, one of the game's main characters and a member of Squad 404.", "links": {"website": ["https://gf.sunborngame.com/", "https://en.wikipedia.org/wiki/Girls%27_Frontline"], "subreddit": ["girlsfrontline"]}, "path": {"8-20": [[300, 935], [300, 942], [310, 942], [310, 935]], "62-166, T:0-1": [[1765, 725], [1769, 725], [1771, 722], [1768, 720], [1768, 717], [1769, 717], [1769, 715], [1768, 715], [1768, 711], [1769, 711], [1770, 710], [1771, 710], [1772, 707], [1772, 708], [1775, 705], [1776, 705], [1780, 709], [1780, 711], [1779, 711], [1779, 713], [1781, 716], [1783, 717], [1784, 720], [1785, 723], [1787, 726], [1790, 725], [1790, 737], [1790, 742], [1790, 747], [1788, 747], [1787, 748], [1783, 747], [1782, 746], [1780, 745], [1776, 745], [1774, 744], [1772, 743], [1764, 743], [1762, 740], [1762, 739], [1764, 737], [1765, 737]]}, "center": {"8-20": [305, 939], "62-166, T:0-1": [1778, 733]}},
{"id": "twtd2d", "name": "Together Through Time", "description": "The logo for the 2018 album Together Through Time, The first full-length studio album and sixth record by Canadian 80's future space band TWRP. This art was drawn by fan communities across Reddit and Discord.", "links": {"website": ["https://twrp.fandom.com/wiki/Together_Through_Time"], "subreddit": ["TWRP"]}, "path": {"109-166, T:0-1": [[1035, 1774], [1033, 1770], [1032, 1765], [1034, 1761], [1036, 1757], [1040, 1754], [1042, 1753], [1052, 1753], [1055, 1755], [1059, 1760], [1061, 1764], [1061, 1772], [1058, 1776], [1053, 1779], [1039, 1779], [1036, 1777], [1035, 1775]]}, "center": {"109-166, T:0-1": [1047, 1766]}},
{"id": "twtcu7", "name": "Flag of Kenya", "description": "Kenya is a country in East Africa. The flag of Kenya displays its signature Maasai shield.", "links": {"website": ["https://en.wikipedia.org/wiki/Kenya", "https://en.wikipedia.org/wiki/Flag_of_Kenya"], "subreddit": ["Kenya"]}, "path": {"56-108": [[781, 981], [781, 1000], [796, 1000], [796, 981]], "31-49": [[697, 984], [697, 999], [728, 999], [728, 984]], "5-30": [[681, 984], [681, 999], [728, 999], [728, 984]], "109-165, T:0-1": [[781, 979], [781, 1017], [796, 1017], [796, 982], [794, 980], [787, 980], [787, 979]]}, "center": {"56-108": [789, 991], "31-49": [713, 992], "5-30": [705, 992], "109-165, T:0-1": [789, 998]}},

@@ -7231,7 +7231,7 @@
{"id": "u8fwzq", "name": "Lucas", "description": "Lucas is the player character and protagonist of Mother 3. This sprite is from his appearance in Snowcap Mountain.", "links": {"website": ["https://earthbound.fandom.com/wiki/Lucas"], "subreddit": ["earthbound"]}, "path": {"160-167, T:0-1": [[1971, 348], [1968, 351], [1968, 352], [1969, 353], [1968, 354], [1970, 356], [1972, 356], [1974, 354], [1973, 353], [1974, 352], [1974, 350], [1972, 348]]}, "center": {"160-167, T:0-1": [1971, 351]}},
{"id": "u8fpy5", "name": "Flanders poppy", "description": "The Flanders poppy, also called the Anzac poppy, is a red flower in the poppy family. They are a symbol of Anzacs (soldiers of the Australian and New Zealand Army Corps), and are worn on November 11 (Remembrance Day) and April 25 (Anzac Day) to commemorate Australian and New Zealander soldiers who died in World War I.\n\nThis art was made by a small Discord.", "links": {"website": ["https://nzhistory.govt.nz/war/anzac-day/poppies", "https://en.wikipedia.org/wiki/Papaver_rhoeas", "https://en.wikipedia.org/wiki/Anzac_Day"]}, "path": {"149-166, T:0-1": [[352, 720], [352, 732], [363, 732], [363, 720]]}, "center": {"149-166, T:0-1": [358, 726]}},
{"id": "u8f1si", "name": "Marisad", "description": "Marisa Kirisame is a character from Touhou Project. In the fan-made anime Fantasy Kaleidoscope ~The Memories of Phantasm~, she is seen crying during one scene. Her funny expression quickly became a popular meme inside the Touhou fandom after being popularized by Touhou YouTuber Chiruno, and was dubbed \"Marisad\" (Marisa + sad). Additionally the emote of Marisa crying spread across many Touhou-related Discord servers, further adding to its popularity.\n\nSmol Marisad was established by r/marisad as a second project on the canvas. It unfortunately had to destroy the old Rick Astley pixel art that used to cover both Tomoko and the Tani logo. Shortly after Smol Marisad started to take shape, r/watamote began drawing Tomoko. After a short period of bickering between the two groups they finally agreed to borders, and an alliance. In the end both were destroyed before the archiving by the streamer Tanizen, who ordered his followers to draw a dog in their place.", "links": {"website": ["https://en.touhouwiki.net/wiki/Marisa_Kirisame"], "subreddit": ["Marisad", "touhou"], "discord": ["UVkWNdhQ"]}, "path": {"109-166, T:0-1": [[1724, 1199], [1724, 1218], [1740, 1218], [1740, 1199]]}, "center": {"109-166, T:0-1": [1732, 1209]}},
{"id": "u8emqw", "name": "Purple hearts", "description": "A purple heart background. Purple represents love and trust between BTS and their fans, the ARMY.", "links": {"website": [], "subreddit": ["bangtan"]}, "path": {"97-104": [[1950, 175], [1943, 182], [1943, 238], [2000, 238], [2000, 194], [1982, 194], [1982, 195], [1989, 203], [1989, 215], [1976, 229], [1976, 238], [1970, 238], [1970, 229], [1958, 217], [1943, 217], [1943, 199], [1961, 199], [1965, 195], [1964, 194], [1960, 194], [1960, 175]], "69-94": [[1943, 194], [1943, 238], [2000, 238], [2000, 194], [1982, 194], [1982, 195], [1989, 203], [1989, 216], [1976, 229], [1976, 238], [1969, 238], [1969, 228], [1958, 217], [1943, 217], [1943, 199], [1962, 199], [1966, 194]], "111-166, T:0-1": [[1966, 194], [1966, 197], [1961, 197], [1959, 199], [1960, 238], [1976, 238], [1977, 238], [1977, 232], [1978, 231], [1977, 230], [1977, 229], [1969, 221], [1969, 220], [1967, 218], [1967, 216], [1966, 215], [1966, 205], [1968, 203], [1968, 202], [1970, 198], [1973, 195], [1992, 195], [1996, 199], [1996, 201], [1998, 203], [1998, 217], [1996, 219], [1996, 220], [1987, 229], [1987, 230], [1986, 231], [1987, 232], [1987, 238], [1999, 238], [1999, 194]]}, "center": {"97-104": [1953, 227], "69-94": [1954, 227], "111-166, T:0-1": [1967, 231]}},
{"id": "u8emqw", "name": "Purple hearts", "description": "A purple heart background. Purple represents love and trust between BTS and their fans, the ARMY.", "links": {"subreddit": ["bangtan"]}, "path": {"97-104": [[1950, 175], [1943, 182], [1943, 238], [2000, 238], [2000, 194], [1982, 194], [1982, 195], [1989, 203], [1989, 215], [1976, 229], [1976, 238], [1970, 238], [1970, 229], [1958, 217], [1943, 217], [1943, 199], [1961, 199], [1965, 195], [1964, 194], [1960, 194], [1960, 175]], "69-94": [[1943, 194], [1943, 238], [2000, 238], [2000, 194], [1982, 194], [1982, 195], [1989, 203], [1989, 216], [1976, 229], [1976, 238], [1969, 238], [1969, 228], [1958, 217], [1943, 217], [1943, 199], [1962, 199], [1966, 194]], "111-166, T:0-1": [[1966, 194], [1966, 197], [1961, 197], [1959, 199], [1960, 238], [1976, 238], [1977, 238], [1977, 232], [1978, 231], [1977, 230], [1977, 229], [1969, 221], [1969, 220], [1967, 218], [1967, 216], [1966, 215], [1966, 205], [1968, 203], [1968, 202], [1970, 198], [1973, 195], [1992, 195], [1996, 199], [1996, 201], [1998, 203], [1998, 217], [1996, 219], [1996, 220], [1987, 229], [1987, 230], [1986, 231], [1987, 232], [1987, 238], [1999, 238], [1999, 194]]}, "center": {"97-104": [1953, 227], "69-94": [1954, 227], "111-166, T:0-1": [1967, 231]}},
{"id": "u8el5a", "name": "BTS logo", "description": "A tricolor version of the BTS double trapezoid logo.", "links": {"website": ["https://en.wikipedia.org/wiki/BTS"], "subreddit": ["bangtan"]}, "path": {"113-165, T:0-1": [[1942, 207], [1942, 235], [1943, 235], [1948, 230], [1950, 230], [1955, 235], [1956, 234], [1956, 208], [1955, 207], [1950, 212], [1948, 212], [1943, 207]]}, "center": {"113-165, T:0-1": [1949, 221]}},
{"id": "u8ek3b", "name": "방탄", "description": "Korean for \"bangtan\", the first word of the band Bangtan Sonyeondan (BTS).", "links": {"website": ["https://en.wikipedia.org/wiki/BTS"], "subreddit": ["bangtan"]}, "path": {"111-153": [[1958, 226], [1958, 236], [1973, 236], [1973, 226]], "71-104": [[1983, 228], [1983, 238], [1997, 238], [1997, 228]], "154-166, T:0-1": [[1958, 226], [1958, 235], [1973, 235], [1973, 226]]}, "center": {"111-153": [1966, 231], "71-104": [1990, 233], "154-166, T:0-1": [1966, 231]}},
{"id": "u8eivo", "name": "보라해", "description": "The Korean text \"보라해\" (borahae) is a portmanteau of bora (violet) and saranghae (I love you), and means \"I purple you\". This is a symbol of love between BTS and their fandom, The ARMY, who often associate the color purple with love.", "links": {"website": ["https://www.urbandictionary.com/define.php?term=I%20Purple%20You"], "subreddit": ["bangtan"]}, "path": {"62-104": [[1944, 230], [1944, 238], [1967, 238], [1967, 230]], "112-166, T:0-1": [[1943, 197], [1943, 205], [1965, 205], [1965, 197]]}, "center": {"62-104": [1956, 234], "112-166, T:0-1": [1954, 201]}},

@@ -344,8 +344,10 @@
 				<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
 			</div>
 			<div class="modal-body d-flex flex-column">
-				<p>Use the Post Direct to Reddit button or manually copy the text below and submit it as a new text post to <a href="https://www.reddit.com/r/placeAtlas2/" target="_blank" rel="noopener noreferrer">r/placeAtlas2</a> on Reddit.</p>
-				<p>Don't forget to flair it with the <span class="badge rounded-pill bg-primary"><i class="bi bi-tag" aria-hidden="true"></i> <span id="redditFlair">New Entry</span></span> tag.</p>
+				<p>
+					If you want to use Reddit, use the <span class="badge bg-primary">Post Direct to Reddit</span> button or manually copy the text below and submit it as a new text post to <a href="https://www.reddit.com/r/placeAtlas2/" target="_blank" rel="noopener noreferrer">r/placeAtlas2</a> on Reddit.
+					Don't forget to flair it with the <span class="badge rounded-pill bg-primary"><i class="bi bi-tag" aria-hidden="true"></i> <span id="redditFlair">New Entry</span></span> flair.</p>
+				<p>If you want to use GitHub, read <a href="https://github.com/placeAtlas/atlas/blob/master/CONTRIBUTING.md#through-github" target="_blank" rel="noopener noreferrer">the contributing guide</a> to submit a patch.</p>
 				<p>We will then check it and add it to the Atlas.</p>
 				<textarea class="form-control flex-grow-1" cols="40" rows="20" id="exportString" title="Raw JSON string" readonly></textarea>
 			</div>