Compare commits

...

57 commits

Author SHA1 Message Date
b07420e0bd Merge pull request 'Bump forgejo-release to v2' (#26) from actions-update into main
All checks were successful
Run Tests on Code / run-tests (push) Successful in 19s
Reviewed-on: https://forgejo.neshweb.net/Neshura/aob-lemmy-bot/pulls/26
2024-08-06 12:23:26 +00:00
0fc71f0a7d Bump forgejo-release to v2
All checks were successful
Run Tests on Code / run-tests (push) Successful in 1m7s
Build binary file and bundle packages / test (pull_request) Successful in 1m6s
Build binary file and bundle packages / build (pull_request) Successful in 1m10s
2024-08-06 12:19:37 +00:00
2ae6468ad8
Release 3.1.0
All checks were successful
Run Tests on Code / run-tests (push) Successful in 47s
Build and Release Binary File / test (push) Successful in 16s
Build and Release Binary File / build (push) Successful in 48s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 37s
2024-07-15 22:15:49 +02:00
2ecfe88cb9
Update to 0.19.5 and include optional thumbnail 2024-07-15 22:15:31 +02:00
7dcc7bfee2
Release 3.0.3
All checks were successful
Run Tests on Code / run-tests (push) Successful in 33s
Build and Release Binary File / test (push) Successful in 33s
Build and Release Binary File / build (push) Successful in 36s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 7s
2024-05-08 16:34:48 +02:00
94d8a4e673
Add Timeout to Status Ping HTTP Request
All checks were successful
Run Tests on Code / run-tests (push) Successful in 14s
2024-05-08 16:34:32 +02:00
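The fix in 94d8a4e673 is the classic one: give the status-ping request an upper bound so an unresponsive server cannot stall the bot's loop. A minimal std-only sketch of the idea (the bot itself pings over HTTP; the helper name and test address here are hypothetical):

```rust
use std::net::{SocketAddr, TcpStream};
use std::time::Duration;

/// Returns true if `addr` accepts a TCP connection within `timeout`.
/// Without a timeout, a silently dropping host can block indefinitely.
fn status_ping(addr: &SocketAddr, timeout: Duration) -> bool {
    TcpStream::connect_timeout(addr, timeout).is_ok()
}

fn main() {
    // Non-routable test address: the call returns within the timeout
    // instead of hanging the ping loop forever.
    let addr: SocketAddr = "10.255.255.1:80".parse().unwrap();
    println!("reachable: {}", status_ping(&addr, Duration::from_millis(200)));
}
```

With an HTTP client such as reqwest, the same effect comes from setting a client-wide timeout on the builder rather than wrapping each call.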
1b585eab7e
Release 3.0.2
All checks were successful
Run Tests on Code / run-tests (push) Successful in 37s
Build and Release Binary File / test (push) Successful in 37s
Build and Release Binary File / build (push) Successful in 37s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 7s
2024-05-07 23:50:14 +02:00
b6f5c38e4a
Overhaul Error handling (Option instead of Result<T, ()>) + Logging changes
All checks were successful
Run Tests on Code / run-tests (push) Successful in 15s
2024-05-07 23:49:55 +02:00
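Commit b6f5c38e4a swaps `Result<T, ()>` for `Option<T>`: a unit error type carries no information, so `Option` states the same thing with less ceremony, and `Result::ok()` converts at the boundary while callers migrate. A sketch of the pattern (function names hypothetical):

```rust
// Before: the unit error carries nothing, so Result<T, ()> is
// just Option<T> with extra ceremony at every call site.
fn parse_id_old(s: &str) -> Result<u32, ()> {
    s.trim().parse::<u32>().map_err(|_| ())
}

// After: Option<T> says "may be absent" directly; Result::ok()
// bridges the two shapes during the migration.
fn parse_id(s: &str) -> Option<u32> {
    s.trim().parse::<u32>().ok()
}

fn main() {
    assert_eq!(parse_id_old(" 42 ").ok(), parse_id(" 42 "));
    println!("{:?}", parse_id(" 42 "));
}
```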
5d708bdb82
Release 3.0.1
All checks were successful
Run Tests on Code / run-tests (push) Successful in 34s
Build and Release Binary File / test (push) Successful in 34s
Build and Release Binary File / build (push) Successful in 38s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 8s
2024-05-07 22:50:22 +02:00
6a8c1662f0
Fix enum problems in config
All checks were successful
Run Tests on Code / run-tests (push) Successful in 16s
2024-05-07 22:50:12 +02:00
e02cd900ed
Release 3.0.0
All checks were successful
Run Tests on Code / run-tests (push) Successful in 41s
Build and Release Binary File / test (push) Successful in 35s
Build and Release Binary File / build (push) Successful in 37s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 8s
2024-05-07 22:35:15 +02:00
32ea83a7bb
Release Candidate 3.0.0-rc.2
All checks were successful
Run Tests on Code / run-tests (push) Successful in 34s
Build and Release Binary File / test (push) Successful in 32s
Build and Release Binary File / build (push) Successful in 37s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 8s
2024-05-07 22:30:02 +02:00
4297860b9e
Legacy fixes for async traits
All checks were successful
Run Tests on Code / run-tests (push) Successful in 32s
2024-05-07 22:29:48 +02:00
affe62b973
Release Candidate 3.0.0-rc.1
Some checks failed
Build and Release Binary File / test (push) Failing after 27s
Run Tests on Code / run-tests (push) Failing after 39s
Build and Release Binary File / build (push) Has been skipped
Build and Release Binary File / upload-generic-package (push) Has been skipped
Build and Release Binary File / upload-debian-package (push) Has been skipped
Build and Release Binary File / upload-release (push) Has been skipped
2024-05-07 22:11:16 +02:00
6520cc65a3
Rewrite Version 3
Some checks failed
Run Tests on Code / run-tests (push) Failing after 36s
2024-05-07 22:10:21 +02:00
aefceda628
Release 2.2.8
All checks were successful
Run Tests on Code / run-tests (push) Successful in 33s
Build and Release Binary File / test (push) Successful in 33s
Build and Release Binary File / build (push) Successful in 36s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 7s
2024-05-06 22:57:55 +02:00
ee5a159431
Yet another another logging fix
All checks were successful
Run Tests on Code / run-tests (push) Successful in 11s
2024-05-06 22:57:41 +02:00
85f8b97607
Yet another logging fix
All checks were successful
Run Tests on Code / run-tests (push) Successful in 11s
2024-05-06 22:57:05 +02:00
966dd8f359
Release 2.2.7
All checks were successful
Run Tests on Code / run-tests (push) Successful in 34s
Build and Release Binary File / test (push) Successful in 31s
Build and Release Binary File / build (push) Successful in 35s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 7s
2024-05-06 22:54:15 +02:00
17e161bc27
Small fixes to logging 2024-05-06 22:54:01 +02:00
3928367692
Release 2.2.6
All checks were successful
Run Tests on Code / run-tests (push) Successful in 32s
Build and Release Binary File / test (push) Successful in 33s
Build and Release Binary File / build (push) Successful in 35s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 8s
2024-05-06 22:36:33 +02:00
070eae961a
Yet more logging and delay between series handling requests (should avoid potential rate limit)
Some checks failed
Run Tests on Code / run-tests (push) Has been cancelled
2024-05-06 22:36:21 +02:00
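Commit 070eae961a spaces out the per-series requests so consecutive calls to the upstream API are not fired back-to-back. A synchronous sketch of the idea (the bot itself is async and would sleep via its runtime; the slugs and delay are illustrative):

```rust
use std::thread::sleep;
use std::time::Duration;

/// Check each series in turn, pausing between iterations so the
/// upstream API sees spaced-out requests (crude client-side rate limiting).
fn handle_series_list(slugs: &[&str], delay: Duration) -> Vec<String> {
    let mut checked = Vec::new();
    for (i, slug) in slugs.iter().enumerate() {
        if i > 0 {
            sleep(delay); // gap between requests, not before the first
        }
        checked.push(format!("checked {slug}"));
    }
    checked
}

fn main() {
    let out = handle_series_list(&["ascendance-of-a-bookworm"], Duration::from_millis(50));
    println!("{out:?}");
}
```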
92103e28ba
Release 2.2.5
All checks were successful
Run Tests on Code / run-tests (push) Successful in 35s
Build and Release Binary File / test (push) Successful in 33s
Build and Release Binary File / build (push) Successful in 35s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 8s
2024-05-06 22:21:14 +02:00
cd78d3c1c7
Reduced connection timeouts and added some logging
Some checks failed
Run Tests on Code / run-tests (push) Has been cancelled
2024-05-06 22:21:04 +02:00
22bbaaa002
Release 2.2.4
All checks were successful
Run Tests on Code / run-tests (push) Successful in 36s
Build and Release Binary File / test (push) Successful in 35s
Build and Release Binary File / build (push) Successful in 36s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 11s
2024-05-06 21:53:14 +02:00
9ee9db5792
Dependency bumps
Some checks failed
Run Tests on Code / run-tests (push) Has been cancelled
2024-05-06 21:53:02 +02:00
23ac0de189
Release 2.2.3
All checks were successful
Run Tests on Code / run-tests (push) Successful in 34s
Build and Release Binary File / test (push) Successful in 32s
Build and Release Binary File / build (push) Successful in 37s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 11s
2024-05-06 21:28:00 +02:00
2dc695577e
Further Bugfixing 2024-05-06 21:27:47 +02:00
534a8022a9
Release 2.2.2
All checks were successful
Run Tests on Code / run-tests (push) Successful in 33s
Build and Release Binary File / test (push) Successful in 32s
Build and Release Binary File / build (push) Successful in 38s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 8s
2024-05-06 21:17:08 +02:00
45bfca8cc5
Bugfix due to bad deduplication
All checks were successful
Run Tests on Code / run-tests (push) Successful in 13s
2024-05-06 21:16:53 +02:00
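Commit 45bfca8cc5 fixes deduplication; one likely culprit of this kind of bug is visible later in the bot.rs diff, where posts are removed by index while iterating over a clone (`post_list.remove(index)` inside `for (index, …) in post_list.clone().iter().enumerate()`). Removing by index shifts every later element, so the indices from the clone go stale and the wrong items get dropped. A sketch of the bug and an order-preserving fix (the data is illustrative):

```rust
use std::collections::HashSet;

// Buggy pattern: removing by index while enumerating a clone. After the
// first removal the live list has shifted, so later indices are stale.
fn dedup_buggy(items: &[&str], seen: &HashSet<&str>) -> Vec<String> {
    let mut list: Vec<String> = items.iter().map(|s| s.to_string()).collect();
    for (i, item) in list.clone().iter().enumerate() {
        if seen.contains(item.as_str()) {
            list.remove(i); // stale index: drops the wrong element next time
        }
    }
    list
}

// Fix: filter (or Vec::retain) never invalidates indices and keeps order.
fn dedup_fixed(items: &[&str], seen: &HashSet<&str>) -> Vec<String> {
    items
        .iter()
        .filter(|s| !seen.contains(*s))
        .map(|s| s.to_string())
        .collect()
}

fn main() {
    let seen = HashSet::from(["a", "b"]);
    // Buggy version drops "c" (a never-posted item) instead of "b".
    println!("buggy: {:?}", dedup_buggy(&["a", "b", "c"], &seen));
    println!("fixed: {:?}", dedup_fixed(&["a", "b", "c"], &seen));
}
```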
34b3bb45c5
Release 2.2.1
All checks were successful
Run Tests on Code / run-tests (push) Successful in 33s
Build and Release Binary File / test (push) Successful in 33s
Build and Release Binary File / build (push) Successful in 37s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 13s
2024-05-06 21:06:17 +02:00
3e78ce5bf6
Merge pull request 'Various Fixes and Changes' (#21) from hotfix-1 into main
All checks were successful
Run Tests on Code / run-tests (push) Successful in 16s
2024-05-06 21:05:07 +02:00
fc4ce74567
Clippy changes
All checks were successful
Build binary file and bundle packages / test (pull_request) Successful in 36s
Run Tests on Code / run-tests (push) Successful in 38s
Build binary file and bundle packages / build (pull_request) Successful in 36s
2024-05-06 21:00:55 +02:00
85f3044224
Lemmy API Crate bump 2024-05-06 20:57:35 +02:00
d50bf01db6
Fix erroneous print statement to use logger 2024-05-06 20:57:32 +02:00
c8d7053b87
De-duplicate Code 2024-05-06 20:57:29 +02:00
962d90fe1d
Remove unnecessary thread structure 2024-05-06 20:57:26 +02:00
36b59240d9
Use macros over functions for error logging 2024-05-06 20:57:20 +02:00
1cd30b1145
Add async-trait crate
Some checks failed
Run Tests on Code / run-tests (push) Failing after 31s
2024-01-08 21:07:49 +01:00
167fcdb7ad
Import fetchers instead of jnovel module in main 2024-01-08 21:07:36 +01:00
b9a26a7b1c
Partially adapt bot module to changes due to fetchers modularization 2024-01-08 21:07:17 +01:00
e5862ba0ec
Move relevant structs to lemmy module 2024-01-08 21:06:52 +01:00
6bd7369ecc
Move jnovel module into fetchers module 2024-01-08 21:06:25 +01:00
ba3110da0e
Add fetchers trait module 2024-01-08 21:06:18 +01:00
0e88326293
Release 2.2.0
All checks were successful
Run Tests on Code / run-tests (push) Successful in 32s
Build and Release Binary File / test (push) Successful in 33s
Build and Release Binary File / build (push) Successful in 42s
Build and Release Binary File / upload-generic-package (push) Successful in 5s
Build and Release Binary File / upload-debian-package (push) Successful in 4s
Build and Release Binary File / upload-release (push) Successful in 7s
2024-01-08 11:07:09 +01:00
a0ff81d582
Remove excessive "Skipping since already posted" logging
All checks were successful
Run Tests on Code / run-tests (push) Successful in 11s
2024-01-08 11:06:00 +01:00
2d65a12781 Delete deprecated deploy script
All checks were successful
Run Tests on Code / run-tests (push) Successful in 29s
2024-01-08 09:54:40 +00:00
bed8881d70 Delete deprecated systemd service file
Some checks failed
Run Tests on Code / run-tests (push) Has been cancelled
2024-01-08 09:54:20 +00:00
bb98a6bfce Merge pull request 'Systemd Unit File' (#16) from systemd into main
All checks were successful
Run Tests on Code / run-tests (push) Successful in 14s
Reviewed-on: #16
2023-12-30 00:43:11 +00:00
7a6ebd6381
Release Candidate 2.2.0-rc.3 (Fix Debian binary location in systemd file)
All checks were successful
Build and Release Binary File / test (push) Successful in 44s
Build and Release Binary File / build (push) Successful in 40s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 8s
Run Tests on Code / run-tests (push) Successful in 9s
Build binary file and bundle packages / test (pull_request) Successful in 14s
Build binary file and bundle packages / build (pull_request) Successful in 37s
2023-12-30 01:37:16 +01:00
d0375b1d8d
Release Candidate 2.2.0-rc.2
All checks were successful
Run Tests on Code / run-tests (push) Successful in 34s
Build and Release Binary File / test (push) Successful in 29s
Build binary file and bundle packages / test (pull_request) Successful in 20s
Build and Release Binary File / build (push) Successful in 40s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 22s
Build binary file and bundle packages / build (pull_request) Successful in 38s
2023-12-30 01:31:18 +01:00
75d479b4e2
Add systemd service file
All checks were successful
Run Tests on Code / run-tests (push) Successful in 13s
2023-12-30 01:30:53 +01:00
5e565df7c0 Merge pull request 'Removes the TUI in favor of event based logging' (#15) from remove_tui into main
All checks were successful
Run Tests on Code / run-tests (push) Successful in 12s
Reviewed-on: #15
2023-12-30 00:28:46 +00:00
8c1da63e0c
rustfmt
All checks were successful
Run Tests on Code / run-tests (push) Successful in 14s
Build binary file and bundle packages / test (pull_request) Successful in 13s
Build binary file and bundle packages / build (pull_request) Successful in 34s
2023-12-30 01:27:11 +01:00
8be74585a0
Extend logging for new posts
All checks were successful
Run Tests on Code / run-tests (push) Successful in 14s
2023-12-30 01:22:13 +01:00
c3ff578c57
Clean up code 2023-12-30 01:22:04 +01:00
cbb50e5d53
Utilize confy for reading and saving history 2023-12-30 01:02:20 +01:00
14 changed files with 1745 additions and 1160 deletions


@@ -137,7 +137,7 @@ jobs:
 run: rm release_blobs/build.env
 -
 name: Release New Version
-uses: actions/forgejo-release@v1
+uses: actions/forgejo-release@v2
 with:
 direction: upload
 url: https://forgejo.neshweb.net

Cargo.lock (generated, 738 changes)

File diff suppressed because it is too large


@@ -1,33 +1,35 @@
 [package]
 authors = ["Neshura"]
 name = "aob-lemmy-bot"
-version = "2.2.0-rc.1"
+version = "3.1.0"
 edition = "2021"
 description = "Bot for automatically posting new chapters of 'Ascendance of a Bookworm' released by J-Novel Club"
 license = "GPL-3.0-or-later"
 [package.metadata.deb]
 extended-description = "Bot for automatically posting new chapters of 'Ascendance of a Bookworm' released by J-Novel Club"
-#maintainer-scripts = "debian/" currently disabled since the application is not runnable as a daemon
+maintainer-scripts = "debian/"
 revision = "1"
 depends = ["libc6", "libssl3", "systemd"]
-#systemd-units = { enable = false }
+systemd-units = { enable = false }
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
 [dependencies]
-chrono = "^0.4.26"
-lemmy_api_common = "0.19.1"
-lemmy_db_schema = "0.19.1"
-once_cell = "^1.18.0"
-reqwest = { version = "^0.11.18", features = ["blocking", "json"] }
-serde = "^1.0.164"
-serde_derive = "^1.0.164"
-serde_json = "^1.0.97"
-strum_macros = "^0.25.0"
-tokio = { version = "^1.32.0", features = ["rt", "rt-multi-thread", "macros"] }
-url = "^2.4.0"
-confy = "^0.5.1"
-toml = "^0.8.8"
+chrono = "^0.4"
+lemmy_api_common = "0.19.5"
+lemmy_db_schema = "0.19.5"
+once_cell = "^1.19"
+reqwest = { version = "^0.12", features = ["blocking", "json"] }
+serde = "^1.0"
+serde_derive = "^1.0"
+serde_json = "^1.0"
+strum_macros = "^0.26"
+tokio = { version = "^1.37", features = ["rt", "rt-multi-thread", "macros"] }
+url = "^2.5"
+confy = "^0.6"
+toml = "^0.8"
 systemd-journal-logger = "^2.1.1"
-log = "^0.4.20"
+log = "^0.4"
+async-trait = "^0.1"
+notify = "6.1.1"
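For context on the version requirements in this diff (not part of the change itself): Cargo's caret requirements treat `^0.4` and `^0.4.26` as the same compatibility range for 0.x crates — anything below the next minor version — so the loosened forms mostly shed stale lower bounds rather than widen what can be resolved.

```toml
# Both requirements resolve within the 0.4.x line:
chrono = "^0.4"      # allows >=0.4.0,  <0.5.0
# chrono = "^0.4.26" # allows >=0.4.26, <0.5.0 (old form, stricter floor)
```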


@@ -1,8 +0,0 @@
-[Unit]
-Description="Automod for bookwormstory.social"
-[Service]
-User=server
-WorkingDirectory=/home/server
-ExecStart=/usr/bin/screen -dmS automod /home/server/automod
-Type=forking


@@ -0,0 +1,13 @@
+[Unit]
+Description="Bot for automatically posting new chapters of 'Ascendance of a Bookworm' released by J-Novel Club"
+After=syslog.target
+After=network-online.target
+[Service]
+Type=simple
+ExecStart=/usr/bin/aob-lemmy-bot
+Restart=always
+RestartSec=3
+[Install]
+WantedBy=multi-user.target


@@ -1,6 +0,0 @@
-#!/bin/bash
-## deploy to machine as automod.new
-## stop automod service
-## mv automod.new to automod
-## restart automod service
-## idea: websocket event?


@@ -1,259 +1,136 @@
use std::collections::HashMap;
use std::sync::{Arc};
use chrono::{DateTime, Duration, Timelike, Utc};
use lemmy_api_common::post::CreatePost;
use lemmy_db_schema::newtypes::{CommunityId, LanguageId};
use lemmy_db_schema::PostFeatureType;
use tokio::sync::{RwLock};
use crate::{jnovel, lemmy, SharedData, write_error, write_info, write_warn};
use crate::config::{Config, PostBody, SeriesConfig};
use crate::jnovel::PostInfo;
use crate::{config::{Config}, HTTP_CLIENT};
use crate::lemmy::{Lemmy};
use crate::post_history::SeriesHistory;
use crate::post_history::{SeriesHistory};
use chrono::{DateTime, Duration, Utc};
use std::sync::{Arc, RwLock};
use notify::{Event, EventKind, event::{AccessKind, AccessMode}, RecursiveMode, Watcher};
use tokio::time::sleep;
use systemd_journal_logger::connected_to_journal;
pub(crate) async fn run(data: Arc<RwLock<SharedData>>) {
let mut last_reload: DateTime<Utc>;
let mut lemmy: Lemmy;
let mut login_error: bool;
let mut communities;
{
let mut write = data.write().await;
// Errors during bot init are likely unrecoverable and therefore should panic the bot
// Does not really matter since the bot will get restarted anyway but this way the uptime url logs a downtime
write.config = Config::load();
last_reload = Utc::now();
}
{
let read = data.read().await;
lemmy = match lemmy::login(&read.config).await {
Ok(data) => data,
Err(_) => panic!(),
};
login_error = false;
communities = match lemmy.get_communities().await {
Ok(data) => data,
Err(_) => panic!(),
};
}
while Utc::now().naive_local().second() != 30 {
sleep(Duration::milliseconds(100).to_std().unwrap()).await;
}
{
let mut write = data.write().await;
write.start = Utc::now();
}
let info_msg = "Bot init successful, starting normal operations".to_owned();
write_info(info_msg);
loop {
idle(&data).await;
{
let mut write = data.write().await;
write.start = Utc::now();
if write.start - last_reload >= Duration::seconds(write.config.config_reload_seconds as i64) {
write.config = Config::load();
let message = "Config reloaded".to_owned();
write_info(message);
}
macro_rules! debug {
($msg:tt) => {
match connected_to_journal() {
true => log::debug!("[DEBUG] {}", $msg),
false => println!("[DEBUG] {}", $msg),
}
};
}
{
let read = data.read().await;
if login_error {
lemmy = match lemmy::login(&read.config).await {
Ok(data) => data,
Err(_) => continue,
};
login_error = false;
}
macro_rules! info {
($msg:tt) => {
match connected_to_journal() {
true => log::info!("[INFO] {}", $msg),
false => println!("[INFO] {}", $msg),
}
};
}
{
let read = data.read().await;
if read.start - last_reload >= Duration::seconds(read.config.config_reload_seconds as i64) {
communities = match lemmy.get_communities().await {
Ok(data) => data,
Err(_) => {
login_error = true;
continue
macro_rules! error {
($msg:tt) => {
match connected_to_journal() {
true => log::error!("[ERROR] {}", $msg),
false => eprintln!("[ERROR] {}", $msg),
}
};
}
pub(crate) struct Bot {
shared_config: Arc<RwLock<Config>>,
history: SeriesHistory,
run_start_time: DateTime<Utc>
}
enum Wait {
Absolute,
Buffer
}
impl Bot {
pub(crate) fn new() -> Self {
let config = Config::load();
let shared_config: Arc<RwLock<Config>> = Arc::new(RwLock::new(config));
let shared_config_copy = shared_config.clone();
let mut watcher = notify::recommended_watcher(move |res: Result<Event, notify::Error>| {
match res {
Ok(event) => {
if event.kind == EventKind::Access(AccessKind::Close(AccessMode::Write)) {
let mut write = shared_config_copy.write().expect("Write Lock Failed");
let new_config = Config::load();
write.series = new_config.series;
write.instance = new_config.instance;
write.protected_communities = new_config.protected_communities;
write.status_post_url = new_config.status_post_url;
info!("Reloaded Configuration");
}
};
let message = "Communities reloaded".to_owned();
write_info(message);
last_reload = Utc::now();
},
Err(e) => {
let msg = format!("Error watching files: {e}");
error!(msg);
}
}
}
}).expect("Watcher Error");
{
let mut write = data.write().await;
write.post_history = match SeriesHistory::load_history() {
watcher.watch(&Config::get_path(), RecursiveMode::NonRecursive).expect("Error in watcher");
let history: SeriesHistory = SeriesHistory::load_history();
Bot { shared_config, history, run_start_time: Utc::now() }
}
pub(crate) async fn run(&mut self) {
loop {
let mut lemmy = match Lemmy::new(&self.shared_config).await {
Ok(data) => data,
Err(_) => continue,
};
}
{
let read = data.read().await;
let series = read.config.series.clone();
drop(read);
for series in series {
if handle_series(&series, &communities, &lemmy, &data).await.is_err() {
login_error = true;
continue
};
}
}
lemmy.get_communities().await;
self.history = SeriesHistory::load_history();
idle(&data).await;
}
}
async fn idle(data: &Arc<RwLock<SharedData>>) {
let read = data.read().await;
let mut sleep_duration = Duration::seconds(30);
if Utc::now() - read.start > sleep_duration {
sleep_duration = Duration::seconds(60);
}
if let Some(status_url) = read.config.status_post_url.clone() {
match reqwest::get(status_url).await {
Ok(_) => {}
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
},
}
};
while Utc::now() - read.start < sleep_duration {
sleep(Duration::milliseconds(100).to_std().unwrap()).await;
}
}
async fn handle_series(
series: &SeriesConfig,
communities: &HashMap<String, CommunityId>,
lemmy: &Lemmy,
data: &Arc<RwLock<SharedData>>,
) -> Result<(), ()> {
let mut post_list = match jnovel::check_feed(series.slug.as_str(), series.parted).await {
Ok(data) => data,
Err(_) => return Err(()),
};
for (index, post_info) in post_list.clone().iter().enumerate() { // todo .clone() likely not needed
let post_part_info = post_info.get_part_info();
let post_lemmy_info = post_info.get_lemmy_info();
{
let read = data.read().await;
if read.post_history.check_for_post(series.slug.as_str(), post_part_info.as_string().as_str(), post_lemmy_info.title.as_str()) {
let message = format!("Skipping '{}' since already posted", post_lemmy_info.title);
write_info(message);
post_list.remove(index);
continue
}
}
let post_series_config = match post_info {
PostInfo::Chapter {..} => {&series.prepub_community},
PostInfo::Volume {..} => {&series.volume_community}
};
let community_id = *communities
.get(post_series_config.name.as_str())
.expect("Given community is invalid");
let post_body = match &post_series_config.post_body {
PostBody::None => None,
PostBody::Description => post_info.get_description(),
PostBody::Custom(text) => Some(text.clone()),
};
let post_data = CreatePost {
name: post_lemmy_info.title,
community_id,
url: Some(post_lemmy_info.url),
body: post_body,
honeypot: None,
nsfw: None,
language_id: Some(LanguageId(37)), // TODO get this id once every few hours per API request, the ordering of IDs suggests that the EN Id might change in the future
};
let post_id = lemmy.post(post_data).await?;
{
let read = data.read().await;
if post_series_config.pin_settings.pin_new_post_community && !read.config.protected_communities.contains(&post_series_config.name) {
let pinned_posts = lemmy.get_community_pinned(community_id).await?;
if !pinned_posts.is_empty() {
let community_pinned_post = &pinned_posts[0];
lemmy.unpin(community_pinned_post.post.id, PostFeatureType::Community).await?;
let start: DateTime<Utc> = Utc::now();
while Utc::now() - start <= Duration::minutes(60) {
self.run_start_time = Utc::now();
self.ping_status().await;
let read_copy = self.shared_config.read().expect("Read Lock Failed").clone();
for series in read_copy.series {
series.update(&mut self.history, &lemmy, &self.shared_config).await;
debug!("Done Updating Series");
self.wait(1, Wait::Absolute).await;
}
lemmy.pin(post_id, PostFeatureType::Community).await?;
} else if read.config.protected_communities.contains(&post_series_config.name) {
let message = format!("Community '{}' for Series '{}' is protected. Is this intended?", &post_series_config.name, series.slug);
write_warn(message);
debug!("Awaiting Timeout");
self.wait(30, Wait::Buffer).await;
debug!("Pinging Server");
self.ping_status().await;
debug!("Awaiting Timeout 2");
self.wait(30, Wait::Absolute).await;
}
}
let read = data.read().await;
if post_series_config.pin_settings.pin_new_post_local {
let pinned_posts = lemmy.get_local_pinned().await?;
if !pinned_posts.is_empty() {
for pinned_post in pinned_posts {
if read.config.protected_communities.contains(&pinned_post.community.name) {
continue
}
else {
let community_pinned_post = &pinned_post;
lemmy.unpin(community_pinned_post.post.id, PostFeatureType::Local).await?;
break
}
lemmy.logout().await;
}
}
async fn ping_status(&self) {
let read_config = &self.shared_config.read().expect("Read Lock Failed").clone();
if let Some(status_url) = &read_config.status_post_url {
match HTTP_CLIENT.get(status_url).send().await {
Ok(_) => {},
Err(e) => {
let err_msg = format!("While pinging status URL: {e}");
error!(err_msg);
}
}
lemmy.pin(post_id, PostFeatureType::Local).await?;
}
let mut series_history = read.post_history.get_series(series.slug.as_str());
let mut part_history = series_history.get_part(post_part_info.as_string().as_str());
drop(read);
match post_info {
PostInfo::Chapter {..} => {
part_history.chapter = post_info.get_lemmy_info().title
},
PostInfo::Volume {..} => {
part_history.volume = post_info.get_lemmy_info().title
}
}
series_history.set_part(post_part_info.as_string().as_str(), part_history);
let mut write = data.write().await;
write.post_history.set_series(series.slug.as_str(), series_history);
let _ = match write.post_history.save_history() {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
};
}
Ok(())
async fn wait(&self, seconds: i64, start_time: Wait) {
let duration: Duration = Duration::seconds(seconds);
let start_time: DateTime<Utc> = match start_time {
Wait::Absolute => Utc::now(),
Wait::Buffer => self.run_start_time,
};
while Utc::now() - start_time < duration {
sleep(Duration::milliseconds(100).to_std().unwrap()).await
}
}
}


@@ -1,12 +1,57 @@
use lemmy_api_common::sensitive::Sensitive;
use serde_derive::{Deserialize, Serialize};
use std::path::PathBuf;
use std::sync::{Arc, RwLock};
use chrono::{Timelike, Utc};
use crate::config::PostBody::Description;
use lemmy_db_schema::PostFeatureType;
use lemmy_db_schema::sensitive::SensitiveString;
use serde_derive::{Deserialize, Serialize};
use crate::lemmy::{Lemmy, PartInfo, PostType};
use crate::post_history::{SeriesHistory};
use systemd_journal_logger::connected_to_journal;
use crate::fetchers::{FetcherTrait, Fetcher};
use crate::fetchers::jnovel::{JNovelFetcher};
macro_rules! debug {
($msg:tt) => {
match connected_to_journal() {
true => log::debug!("[DEBUG] {}", $msg),
false => println!("[DEBUG] {}", $msg),
}
};
}
macro_rules! info {
($msg:tt) => {
match connected_to_journal() {
true => log::info!("[INFO] {}", $msg),
false => println!("[INFO] {}", $msg),
}
};
}
macro_rules! warn {
($msg:tt) => {
match connected_to_journal() {
true => log::warn!("[WARN] {}", $msg),
false => println!("[WARN] {}", $msg),
}
};
}
macro_rules! error {
($msg:tt) => {
match connected_to_journal() {
true => log::error!("[ERROR] {}", $msg),
false => eprintln!("[ERROR] {}", $msg),
}
};
}
#[derive(Serialize, Deserialize, Clone, Debug)]
pub(crate) struct Config {
pub(crate) instance: String,
username: String,
password: String,
username: SensitiveString,
password: SensitiveString,
pub(crate) status_post_url: Option<String>,
pub(crate) config_reload_seconds: u32,
pub(crate) protected_communities: Vec<String>,
@@ -40,12 +85,16 @@ impl Config {
cfg
}
pub(crate) fn get_username(&self) -> Sensitive<String> {
Sensitive::new(self.username.clone())
pub(crate) fn get_path() -> PathBuf {
confy::get_configuration_file_path(env!("CARGO_PKG_NAME"), "config").expect("Application will not without confy")
}
pub(crate) fn get_password(&self) -> Sensitive<String> {
Sensitive::new(self.password.clone())
pub(crate) fn get_username(&self) -> SensitiveString {
self.username.clone()
}
pub(crate) fn get_password(&self) -> SensitiveString {
self.password.clone()
}
}
@@ -53,12 +102,12 @@ impl Default for Config {
fn default() -> Self {
Config {
instance: "".to_owned(),
username: "".to_owned(),
password: "".to_owned(),
username: SensitiveString::from("".to_owned()),
password: SensitiveString::from("".to_owned()),
status_post_url: None,
config_reload_seconds: 21600,
protected_communities: vec![],
series: vec![]
series: vec![],
}
}
}
@@ -69,6 +118,162 @@ pub(crate) struct SeriesConfig {
pub(crate) parted: bool,
pub(crate) prepub_community: PostConfig,
pub(crate) volume_community: PostConfig,
pub(crate) fetcher: Fetcher
}
impl SeriesConfig {
pub(crate) async fn update(&self, history: &mut SeriesHistory, lemmy: &Lemmy, config: &Arc<RwLock<Config>>) {
let info_msg = format!("Checking {} for Updates", self.slug);
info!(info_msg);
let mut fetcher: Fetcher = match &self.fetcher {
Fetcher::Jnc(_) => {
Fetcher::Jnc(JNovelFetcher::new())
},
/*default => {
let err_msg = format!("Fetcher {default} not implemented");
error!(err_msg);
return;
}*/
};
match fetcher {
Fetcher::Jnc(ref mut jnc) => {
jnc.set_series(self.slug.clone());
jnc.set_part_option(self.parted);
}
}
let post_list = match fetcher.check_feed().await {
Ok(data) => data,
Err(_) => {
let err_msg = format!("While checking feed for {}", self.slug);
error!(err_msg);
return;
}
};
if post_list.is_empty() && Utc::now().minute() % 10 == 0 {
let info_msg = "No Updates found";
info!(info_msg);
}
for post_info in post_list.iter() {
if history.check_for_post(
self.slug.as_str(),
post_info.get_part_info().unwrap_or(PartInfo::NoParts).as_string().as_str(),
post_info.get_info().title.as_str()
) {
continue
}
let post_data = post_info.get_post_data(self, lemmy);
let info = format!(
"Posting '{}' to {}",
post_info.get_info().title.as_str(),
post_info.get_post_config(self).name.as_str()
);
info!(info);
let post_id = match lemmy.post(post_data).await {
Some(data) => data,
None=> {
error!("Error posting chapter");
return;
}
};
let read_config = config.read().expect("Read Lock Failed").clone();
if post_info.get_post_config(self).pin_settings.pin_new_post_community
&& !read_config
.protected_communities
.contains(&post_info.get_post_config(self).name)
{
let info = format!(
"Pinning '{}' to {}",
post_info.get_info().title,
post_info.get_post_config(self).name.as_str()
);
info!(info);
let pinned_posts = lemmy.get_community_pinned(lemmy.get_community_id(&post_info.get_post_config(self).name)).await.unwrap_or_else(|| {
error!("Fetching pinned posts for community failed");
vec![]
});
if !pinned_posts.is_empty() {
let community_pinned_post = &pinned_posts[0];
if lemmy.unpin(community_pinned_post.post.id, PostFeatureType::Community).await.is_none() {
error!("Error un-pinning post");
}
}
if lemmy.pin(post_id, PostFeatureType::Community).await.is_none() {
error!("Error pinning post");
}
} else if read_config
.protected_communities
.contains(&post_info.get_post_config(self).name)
{
let message = format!(
"Community '{}' for Series '{}' is protected. Is this intended?",
&post_info.get_post_config(self).name, self.slug
);
warn!(message);
}
if post_info.get_post_config(self).pin_settings.pin_new_post_local {
let info = format!("Pinning '{}' to Instance", post_info.get_info().title);
info!(info);
let pinned_posts = match lemmy.get_local_pinned().await {
Some(data) => {data}
None => {
error!("Error fetching pinned posts");
vec![]
}
};
if !pinned_posts.is_empty() {
for pinned_post in pinned_posts {
if read_config
.protected_communities
.contains(&pinned_post.community.name)
{
continue;
} else {
let community_pinned_post = &pinned_post;
if lemmy.unpin(community_pinned_post.post.id, PostFeatureType::Local).await.is_none() {
error!("Error un-pinning post");
continue;
}
break;
}
}
}
if lemmy.pin(post_id, PostFeatureType::Local).await.is_none() {
error!("Error pinning post");
};
}
let mut series_history = history.get_series(self.slug.as_str());
let mut part_history = series_history.get_part(post_info.get_part_info().unwrap_or(PartInfo::NoParts).as_string().as_str());
match post_info.post_type {
Some(post_type) => {
match post_type {
PostType::Chapter => part_history.chapter = post_info.get_info().title,
PostType::Volume => part_history.volume = post_info.get_info().title,
}
}
None => part_history.chapter = post_info.get_info().title,
}
series_history.set_part(post_info.get_part_info().unwrap_or(PartInfo::NoParts).as_string().as_str(), part_history);
history
.set_series(self.slug.as_str(), series_history);
debug!("Saving History");
history.save_history();
}
}
}
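The update flow above rotates instance-level pins: it walks the currently pinned posts, skips any post in a protected community, and un-pins the first eligible one before pinning the new post. A standalone sketch of that selection rule (function name and signature are illustrative, not part of this crate):

```rust
// Hypothetical sketch of the local pin-rotation rule in SeriesConfig::update:
// among the instance-pinned posts, pick the first one whose community is NOT
// in the protected list as the un-pin candidate; protected pins are never touched.
fn find_unpin_candidate<'a>(
    pinned_communities: &'a [&'a str],
    protected: &[&str],
) -> Option<&'a str> {
    pinned_communities
        .iter()
        .find(|c| !protected.contains(*c))
        .copied()
}

fn main() {
    let pinned = ["announcements", "series-a", "series-b"];
    let protected = ["announcements"];
    // "announcements" is protected, so the first eligible pin is "series-a".
    assert_eq!(find_unpin_candidate(&pinned, &protected), Some("series-a"));
    // If every pinned post is protected, nothing gets un-pinned.
    assert_eq!(find_unpin_candidate(&["announcements"], &protected), None);
}
```

The real code performs the un-pin and pin as separate Lemmy API calls and merely logs on failure, so a failed un-pin does not abort posting.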
#[derive(Debug, Serialize, Deserialize, Clone)]

src/fetchers/jnovel.rs (new file, 307 lines added)

@@ -0,0 +1,307 @@
use crate::{HTTP_CLIENT};
use chrono::{DateTime, Duration, Utc};
use serde_derive::{Deserialize, Serialize};
use std::collections::HashMap;
use std::ops::Sub;
use async_trait::async_trait;
use crate::fetchers::{FetcherTrait};
use crate::lemmy::{PartInfo, PostInfo, PostInfoInner, PostType};
use systemd_journal_logger::connected_to_journal;
use crate::lemmy::PartInfo::{NoParts, Part};
macro_rules! error {
($msg:tt) => {
match connected_to_journal() {
true => log::error!("[ERROR] {}", $msg),
false => eprintln!("[ERROR] {}", $msg),
}
};
}
macro_rules! info {
($msg:tt) => {
match connected_to_journal() {
true => log::info!("[INFO] {}", $msg),
false => println!("[INFO] {}", $msg),
}
};
}
static PAST_DAYS_ELIGIBLE: u8 = 4;
macro_rules! api_url {
() => {
"https://labs.j-novel.club/app/v1".to_owned()
};
}
macro_rules! jnc_base_url {
() => {
"https://j-novel.club".to_owned()
};
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
struct VolumesWrapper {
volumes: Vec<VolumeDetail>,
pagination: PaginationInfo,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
struct ChapterWrapper {
parts: Vec<ChapterDetail>,
pagination: PaginationInfo,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
struct PaginationInfo {
limit: usize,
skip: usize,
#[serde(alias = "lastPage")]
last_page: bool,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub(crate) struct Cover {
#[serde(alias = "coverUrl")]
pub(crate) cover: String,
#[serde(alias = "thumbnailUrl")]
pub(crate) thumbnail: String,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub(crate) struct VolumeDetail {
pub(crate) title: String,
pub(crate) slug: String,
number: u8,
publishing: String,
#[serde(alias = "shortDescription")]
short_description: String,
cover: Cover,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub(crate) struct ChapterDetail {
pub(crate) title: String,
pub(crate) slug: String,
launch: String,
pub(crate) cover: Option<Cover>,
}
#[derive(Deserialize, Serialize, Debug, Clone)]
pub(crate) struct JNovelFetcher {
series_slug: String,
series_has_parts: bool
}
impl Default for JNovelFetcher {
fn default() -> Self {
Self {
series_slug: "".to_owned(),
series_has_parts: false,
}
}
}
impl JNovelFetcher {
pub(crate) fn set_series(&mut self, series: String) {
self.series_slug = series;
}
pub(crate) fn set_part_option(&mut self, has_parts: bool) {
self.series_has_parts = has_parts;
}
}
#[async_trait]
impl FetcherTrait for JNovelFetcher {
fn new() -> Self {
JNovelFetcher {
series_slug: "".to_owned(),
series_has_parts: false
}
}
async fn check_feed(&self) -> Result<Vec<PostInfo>, ()> {
let response = match HTTP_CLIENT
.get(api_url!() + "/series/" + self.series_slug.as_str() + "/volumes?format=json")
.send()
.await
{
Ok(data) => match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("While checking feed: {e}");
error!(err_msg);
return Err(());
}
},
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
return Err(());
}
};
let mut volume_brief_data: VolumesWrapper = match serde_json::from_str(&response) {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
return Err(());
}
};
volume_brief_data.volumes.reverse(); // Makes breaking out of the volume loop easier
// If the series has no parts, use 0 as the part indicator; no series with parts has a part 0
let mut volume_map: HashMap<u8, PostInfo> = HashMap::new();
let mut prepub_map: HashMap<u8, PostInfo> = HashMap::new();
for volume in volume_brief_data.volumes.iter() {
let publishing_date = DateTime::parse_from_rfc3339(&volume.publishing).unwrap();
if publishing_date < Utc::now().sub(Duration::days(PAST_DAYS_ELIGIBLE as i64)) {
match self.series_has_parts {
true => continue,
false => break,
}
}
let new_part_info: PartInfo;
if self.series_has_parts {
let mut part_number: Option<u8> = None;
let splits: Vec<&str> = volume.slug.split('-').collect();
for (index, split) in splits.clone().into_iter().enumerate() {
if split == "part" {
part_number = Some(
splits[index + 1]
.parse::<u8>()
.expect("Split Element after 'Part' should always be a number"),
);
break;
}
}
match part_number {
Some(number) => new_part_info = Part(number),
None => {
info!("No Part found, assuming 1");
new_part_info = Part(1);
}
}
} else {
new_part_info = NoParts;
}
let post_url = format!(
"{}/series/{}#volume-{}",
jnc_base_url!(),
self.series_slug.as_str(),
volume.number
);
let post_details = PostInfoInner {
title: volume.title.clone(),
url: post_url.clone(),
thumbnail: Some(volume.cover.thumbnail.clone())
};
let new_post_info = PostInfo {
post_type: Some(PostType::Volume),
part: Some(new_part_info),
description: Some(volume.short_description.clone()),
lemmy_info: post_details,
};
let part_id = new_part_info.as_u8();
if publishing_date <= Utc::now() {
volume_map
.entry(part_id)
.and_modify(|val| {
if *val < new_post_info {
*val = new_post_info.clone()
}
})
.or_insert(new_post_info);
}
if let Some(prepub_info) = get_latest_prepub(&volume.slug).await {
let prepub_post_info = PostInfo {
post_type: Some(PostType::Chapter),
part: Some(new_part_info),
lemmy_info: prepub_info,
description: None,
};
prepub_map
.entry(part_id)
.and_modify(|val| {
if *val < prepub_post_info {
*val = prepub_post_info.clone()
}
})
.or_insert(prepub_post_info);
}
}
let mut result_vec: Vec<PostInfo> = volume_map.values().cloned().collect();
let mut prepub_vec: Vec<PostInfo> = prepub_map.values().cloned().collect();
result_vec.append(&mut prepub_vec);
Ok(result_vec)
}
}
async fn get_latest_prepub(volume_slug: &str) -> Option<PostInfoInner> {
let response = match HTTP_CLIENT
.get(api_url!() + "/volumes/" + volume_slug + "/parts?format=json")
.send()
.await
{
Ok(data) => match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("While getting latest PrePub: {e}");
error!(err_msg);
return None;
}
},
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
return None;
}
};
let mut volume_prepub_parts_data: ChapterWrapper = match serde_json::from_str(&response) {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
return None;
}
};
volume_prepub_parts_data.parts.reverse(); // Makes breaking out of the parts loop easier
let mut post_details: Option<PostInfoInner> = None;
for prepub_part in volume_prepub_parts_data.parts.iter() {
let publishing_date = DateTime::parse_from_rfc3339(&prepub_part.launch).unwrap();
if publishing_date > Utc::now() {
break;
} else if publishing_date < Utc::now().sub(Duration::days(PAST_DAYS_ELIGIBLE as i64)) {
continue;
}
let thumbnail = prepub_part.cover.as_ref().map(|cover| cover.thumbnail.clone());
let post_url = format!("{}/read/{}", jnc_base_url!(), prepub_part.slug);
post_details = Some(PostInfoInner {
title: prepub_part.title.clone(),
url: post_url.clone(),
thumbnail
});
}
post_details
}
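The part-number extraction in check_feed above splits the volume slug on '-', looks for the token "part", and parses the following token. A standalone sketch of that logic (the function name is illustrative; the crate inlines this and panics via expect on malformed slugs):

```rust
// Hypothetical sketch of the slug parsing used in check_feed: split on '-',
// find the token "part", and parse the next token as the part number.
fn parse_part_number(slug: &str) -> Option<u8> {
    let splits: Vec<&str> = slug.split('-').collect();
    for (index, split) in splits.iter().enumerate() {
        if *split == "part" {
            // The fetcher expects the element after "part" to be numeric;
            // this sketch returns None instead of panicking on bad input.
            return splits.get(index + 1)?.parse::<u8>().ok();
        }
    }
    None
}

fn main() {
    assert_eq!(parse_part_number("ascendance-of-a-bookworm-part-5"), Some(5));
    assert_eq!(parse_part_number("a-series-without-parts"), None);
}
```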

src/fetchers/mod.rs (new file, 33 lines added)

@@ -0,0 +1,33 @@
use async_trait::async_trait;
use serde_derive::{Deserialize, Serialize};
use strum_macros::Display;
use crate::fetchers::Fetcher::Jnc;
use crate::fetchers::jnovel::JNovelFetcher;
use crate::lemmy::{PostInfo};
pub mod jnovel;
#[async_trait]
pub(crate) trait FetcherTrait {
fn new() -> Self where Self: Sized;
async fn check_feed(&self) -> Result<Vec<PostInfo>, ()>;
}
impl Fetcher {
pub(crate) async fn check_feed(&self) -> Result<Vec<PostInfo>, ()> {
match self {
Jnc(fetcher) => fetcher.check_feed().await,
/*default => {
let err_msg = format!("Fetcher {default} is not implemented");
error!(err_msg);
Err(())
}*/
}
}
}
#[derive(Deserialize, Serialize, Debug, Clone, Display)]
pub(crate) enum Fetcher {
#[serde(rename = "jnc")]
Jnc(#[serde(skip)] JNovelFetcher)
}
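mod.rs forwards check_feed through a match on the Fetcher enum rather than through a trait object. A minimal, synchronous sketch of this enum-dispatch pattern (type and field names are illustrative, not the crate's real types):

```rust
// Minimal sketch of the enum-dispatch pattern used by Fetcher: each variant
// wraps a concrete fetcher, and the enum forwards trait calls via a match.
trait FetcherTrait {
    fn check_feed(&self) -> Result<Vec<String>, ()>;
}

struct JncFetcher {
    slug: String,
}

impl FetcherTrait for JncFetcher {
    fn check_feed(&self) -> Result<Vec<String>, ()> {
        Ok(vec![format!("{}: latest volume", self.slug)])
    }
}

enum Fetcher {
    Jnc(JncFetcher),
    // Adding a new source means a new variant plus one new match arm below.
}

impl Fetcher {
    fn check_feed(&self) -> Result<Vec<String>, ()> {
        match self {
            Fetcher::Jnc(fetcher) => fetcher.check_feed(),
        }
    }
}

fn main() {
    let f = Fetcher::Jnc(JncFetcher { slug: "example-series".to_owned() });
    assert!(f.check_feed().is_ok());
}
```

Compared with Box<dyn FetcherTrait>, the enum keeps the config value serde-deserializable (the "jnc" variant tag) and sidesteps dyn-compatibility issues with the async trait method.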


@@ -1,398 +0,0 @@
use std::cmp::Ordering;
use std::collections::HashMap;
use std::ops::Sub;
use chrono::{DateTime, Duration, Utc};
use serde_derive::{Deserialize, Serialize};
use url::Url;
use crate::{HTTP_CLIENT, write_error};
use crate::jnovel::PartInfo::{NoParts, Part};
use crate::jnovel::PostInfo::{Chapter, Volume};
static PAST_DAYS_ELIGIBLE: u8 = 4;
macro_rules! api_url {
() => {
"https://labs.j-novel.club/app/v1".to_owned()
};
}
macro_rules! jnc_base_url {
() => {
"https://j-novel.club".to_owned()
};
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
struct VolumesWrapper {
volumes: Vec<VolumeDetail>,
pagination: PaginationInfo,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
struct ChapterWrapper {
parts: Vec<ChapterDetail>,
pagination: PaginationInfo,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
struct PaginationInfo {
limit: usize,
skip: usize,
#[serde(alias = "lastPage")]
last_page: bool,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub(crate) struct Cover {
#[serde(alias = "coverUrl")]
pub(crate) cover: String,
#[serde(alias = "thumbnailUrl")]
pub(crate) thumbnail: String,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub(crate) struct VolumeDetail {
pub(crate) title: String,
pub(crate) slug: String,
pub(crate) number: u8,
pub(crate) publishing: String,
#[serde(alias = "shortDescription")]
pub(crate) short_description: String,
pub(crate) cover: Cover,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub(crate) struct ChapterDetail {
pub(crate) title: String,
pub(crate) slug: String,
pub(crate) launch: String,
pub(crate) cover: Option<Cover>,
}
#[derive(Debug, Clone)]
pub(crate) struct LemmyPostInfo {
pub(crate) title: String,
pub(crate) url: Url,
}
#[derive(Debug, Copy, Clone)]
pub(crate) enum PartInfo {
NoParts,
Part(u8),
}
impl PartInfo {
pub(crate) fn as_u8(&self) -> u8 {
match self {
Part(number) => *number,
NoParts => 0,
}
}
pub(crate) fn as_string(&self) -> String {
self.as_u8().to_string()
}
}
impl PartialEq for PartInfo {
fn eq(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric == other_numeric
}
}
impl PartialOrd for PartInfo {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
if self.gt(other) { Some(Ordering::Greater) }
else if self.eq(other) { Some(Ordering::Equal) }
else { Some(Ordering::Less) }
}
fn lt(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric < other_numeric
}
fn le(&self, other: &Self) -> bool {
!self.gt(other)
}
fn gt(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric > other_numeric
}
fn ge(&self, other: &Self) -> bool {
!self.lt(other)
}
}
#[derive(Debug, Clone)]
pub(crate) enum PostInfo {
Chapter { part: PartInfo, lemmy_info: LemmyPostInfo },
Volume { part: PartInfo, description: String, lemmy_info: LemmyPostInfo },
}
impl PostInfo {
pub(crate) fn get_part_info(&self) -> PartInfo {
match self {
Chapter {part: part_info, ..} => *part_info,
Volume {part: part_info, ..} => *part_info
}
}
pub(crate) fn get_lemmy_info(&self) -> LemmyPostInfo {
match self {
Chapter {lemmy_info, ..} => lemmy_info.clone(),
Volume {lemmy_info, ..} => lemmy_info.clone()
}
}
pub(crate) fn get_description(&self) -> Option<String> {
match self {
Chapter {..} => None,
Volume {description, ..} => Some(description.clone()),
}
}
}
impl PartialEq for PostInfo {
fn eq(&self, other: &Self) -> bool {
let self_part = match self {
Chapter {part, ..} => part,
Volume {part, ..} => part,
};
let other_part = match other {
Chapter {part, ..} => part,
Volume {part, ..} => part,
};
self_part.eq(other_part)
}
}
impl PartialOrd for PostInfo {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
if self.gt(other) { Some(Ordering::Greater) }
else if self.eq(other) { Some(Ordering::Equal) }
else { Some(Ordering::Less) }
}
fn lt(&self, other: &Self) -> bool {
let self_part = match self {
Chapter {part, ..} => part,
Volume {part, ..} => part,
};
let other_part = match other {
Chapter {part, ..} => part,
Volume {part, ..} => part,
};
self_part < other_part
}
fn le(&self, other: &Self) -> bool {
!self.gt(other)
}
fn gt(&self, other: &Self) -> bool {
let self_part = match self {
Chapter {part, ..} => part,
Volume {part, ..} => part,
};
let other_part = match other {
Chapter {part, ..} => part,
Volume {part, ..} => part,
};
self_part > other_part
}
fn ge(&self, other: &Self) -> bool {
!self.lt(other)
}
}
pub(crate) async fn check_feed(series_slug: &str, series_has_parts: bool) -> Result<Vec<PostInfo>, ()> {
let response = match HTTP_CLIENT
.get(api_url!() + "/series/" + series_slug + "/volumes?format=json")
.send()
.await {
Ok(data) => {
match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
}
},
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
};
let mut volume_brief_data: VolumesWrapper = match serde_json::from_str(&response) {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
};
volume_brief_data.volumes.reverse(); // Makes breaking out of the volume loop easier
// If no parts just use 0 as Part indicator as no Series with Parts has a Part 0
let mut volume_map: HashMap<u8, PostInfo> = HashMap::new();
let mut prepub_map: HashMap<u8, PostInfo> = HashMap::new();
for volume in volume_brief_data.volumes.iter() {
let publishing_date = DateTime::parse_from_rfc3339(&volume.publishing).unwrap();
if publishing_date < Utc::now().sub(Duration::days(PAST_DAYS_ELIGIBLE as i64)) {
match series_has_parts {
true => continue,
false => break,
}
}
let new_part_info: PartInfo;
if series_has_parts {
let mut part_number: Option<u8> = None;
let splits: Vec<&str> = volume.slug.split('-').collect();
for (index, split) in splits.clone().into_iter().enumerate() {
if split == "part" {
part_number = Some(splits[index +1]
.parse::<u8>()
.expect("Split Element after 'Part' should always be a number"));
break;
}
}
match part_number {
Some(number) => new_part_info = Part(number),
None => {
println!("No Part found, assuming 1");
new_part_info = Part(1);
},
}
}
else {
new_part_info = NoParts;
}
let post_url = format!("{}/series/{series_slug}#volume-{}", jnc_base_url!(), volume.number);
let post_details = LemmyPostInfo {
title: volume.title.clone(),
url: Url::parse(&post_url).unwrap(),
};
let new_post_info = Volume {
part: new_part_info,
description: volume.short_description.clone(),
lemmy_info: post_details,
};
let part_id = new_part_info.as_u8();
if publishing_date <= Utc::now() {
volume_map
.entry(part_id)
.and_modify(|val| {
if *val < new_post_info {
*val = new_post_info.clone()
}
})
.or_insert(new_post_info);
}
if let Some(prepub_info) = get_latest_prepub(&volume.slug).await? {
let prepub_post_info = Chapter {
part: new_part_info,
lemmy_info: prepub_info
};
prepub_map
.entry(part_id)
.and_modify(|val| {
if *val < prepub_post_info {
*val = prepub_post_info.clone()
}
})
.or_insert(prepub_post_info);
}
}
let mut result_vec: Vec<PostInfo> = volume_map.values().cloned().collect();
let mut prepub_vec: Vec<PostInfo> = prepub_map.values().cloned().collect();
result_vec.append(&mut prepub_vec);
Ok(result_vec)
}
async fn get_latest_prepub(volume_slug: &str) -> Result<Option<LemmyPostInfo>, ()> {
let response = match HTTP_CLIENT
.get(api_url!() + "/volumes/" + volume_slug + "/parts?format=json")
.send()
.await {
Ok(data) => {
match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
}
},
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
};
let mut volume_prepub_parts_data: ChapterWrapper = match serde_json::from_str(&response) {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
};
volume_prepub_parts_data.parts.reverse(); // Makes breaking out of the parts loop easier
let mut post_details: Option<LemmyPostInfo> = None;
for prepub_part in volume_prepub_parts_data.parts.iter() {
let publishing_date = DateTime::parse_from_rfc3339(&prepub_part.launch).unwrap();
if publishing_date > Utc::now() {
break
}
else if publishing_date < Utc::now().sub(Duration::days(PAST_DAYS_ELIGIBLE as i64)) {
continue
}
let post_url = format!("{}/read/{}", jnc_base_url!(), prepub_part.slug);
post_details = Some(LemmyPostInfo {
title: prepub_part.title.clone(),
url: Url::parse(&post_url).unwrap(),
});
}
Ok(post_details)
}


@@ -1,137 +1,295 @@
use std::collections::HashMap;
use std::cmp::Ordering;
use crate::config::{Config, PostBody, PostConfig, SeriesConfig};
use crate::{HTTP_CLIENT};
use lemmy_api_common::community::{ListCommunities, ListCommunitiesResponse};
use lemmy_api_common::lemmy_db_views::structs::PostView;
use lemmy_api_common::person::{Login, LoginResponse};
use lemmy_api_common::post::{CreatePost, FeaturePost, GetPosts, GetPostsResponse};
use lemmy_api_common::sensitive::Sensitive;
use lemmy_db_schema::newtypes::{CommunityId, PostId};
use lemmy_db_schema::newtypes::{CommunityId, LanguageId, PostId};
use lemmy_db_schema::{ListingType, PostFeatureType};
use reqwest::StatusCode;
use crate::config::Config;
use crate::{HTTP_CLIENT, write_error};
use std::collections::HashMap;
use std::sync::{RwLock};
use lemmy_db_schema::sensitive::SensitiveString;
use serde::{Deserialize, Serialize};
use systemd_journal_logger::connected_to_journal;
pub(crate) struct Lemmy {
jwt_token: Sensitive<String>,
instance: String,
macro_rules! debug {
($msg:tt) => {
match connected_to_journal() {
true => log::debug!("[DEBUG] {}", $msg),
false => println!("[DEBUG] {}", $msg),
}
};
}
pub(crate) async fn login(config: &Config) -> Result<Lemmy, ()> {
let login_params = Login {
username_or_email: config.get_username(),
password: config.get_password(),
totp_2fa_token: None,
};
let response = match HTTP_CLIENT
.post(config.instance.to_owned() + "/api/v3/user/login")
.json(&login_params)
.send()
.await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
macro_rules! error {
($msg:tt) => {
match connected_to_journal() {
true => log::error!("[ERROR] {}", $msg),
false => eprintln!("[ERROR] {}", $msg),
}
};
}
match response.status() {
StatusCode::OK => {
let data: LoginResponse = response.json().await.expect("Successful Login Request should return JSON");
match data.jwt {
Some(token) => Ok(Lemmy {
jwt_token: token.clone(),
instance: config.instance.to_owned(),
}),
None => {
let err_msg = "Login did not return JWT token. Are the credentials valid?".to_owned();
write_error(err_msg);
Err(())
pub(crate) struct Lemmy {
jwt_token: SensitiveString,
instance: String,
communities: HashMap<String, CommunityId>,
}
#[derive(Debug, Clone)]
pub(crate) struct PostInfoInner {
pub(crate) title: String,
pub(crate) url: String,
pub(crate) thumbnail: Option<String>
}
#[derive(Debug, Copy, Clone)]
pub(crate) enum PartInfo {
NoParts,
Part(u8),
}
impl PartInfo {
pub(crate) fn as_u8(&self) -> u8 {
match self {
PartInfo::Part(number) => *number,
PartInfo::NoParts => 0,
}
}
pub(crate) fn as_string(&self) -> String {
self.as_u8().to_string()
}
}
impl PartialEq for PartInfo {
fn eq(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric == other_numeric
}
}
impl PartialOrd for PartInfo {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
if self.gt(other) {
Some(Ordering::Greater)
} else if self.eq(other) {
Some(Ordering::Equal)
} else {
Some(Ordering::Less)
}
}
fn lt(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric < other_numeric
}
fn le(&self, other: &Self) -> bool {
!self.gt(other)
}
fn gt(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric > other_numeric
}
fn ge(&self, other: &Self) -> bool {
!self.lt(other)
}
}
#[derive(Debug, Clone, Copy)]
pub(crate) enum PostType {
Chapter,
Volume
}
#[derive(Debug, Clone)]
pub(crate) struct PostInfo {
pub(crate) part: Option<PartInfo>,
pub(crate) lemmy_info: PostInfoInner,
pub(crate) description: Option<String>,
pub(crate) post_type: Option<PostType>
}
impl PostInfo {
pub(crate) fn get_info(&self) -> PostInfoInner {
self.lemmy_info.clone()
}
pub(crate) fn get_description(&self) -> Option<String> {
self.description.clone()
}
pub(crate) fn get_part_info(&self) -> Option<PartInfo> {
self.part
}
pub(crate) fn get_post_config(&self, series: &SeriesConfig) -> PostConfig {
match self.post_type {
Some(post_type) => {
match post_type {
PostType::Chapter => series.prepub_community.clone(),
PostType::Volume => series.volume_community.clone(),
}
}
},
status => {
let err_msg = format!("Unexpected HTTP Status '{}' during Login", status);
write_error(err_msg);
Err(())
None => series.prepub_community.clone(),
}
}
pub(crate) fn get_post_data(&self, series: &SeriesConfig, lemmy: &Lemmy) -> CreatePost {
let post_config = self.get_post_config(series);
let post_body = match &post_config.post_body {
PostBody::None => None,
PostBody::Description => self.get_description(),
PostBody::Custom(text) => Some(text.clone()),
};
let community_id: CommunityId = lemmy.get_community_id(&post_config.name);
CreatePost {
name: self.get_info().title.clone(),
community_id,
url: Some(self.get_info().url),
custom_thumbnail: self.get_info().thumbnail,
body: post_body,
alt_text: None,
honeypot: None,
nsfw: None,
language_id: Some(LanguageId(37)), // TODO get this id once every few hours per API request, the ordering of IDs suggests that the EN Id might change in the future
}
}
}
impl PartialEq for PostInfo {
fn eq(&self, other: &Self) -> bool {
self.part.eq(&other.part)
}
}
impl PartialOrd for PostInfo {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
if self.gt(other) {
Some(Ordering::Greater)
} else if self.eq(other) {
Some(Ordering::Equal)
} else {
Some(Ordering::Less)
}
}
fn lt(&self, other: &Self) -> bool {
self.part < other.part
}
fn le(&self, other: &Self) -> bool {
!self.gt(other)
}
fn gt(&self, other: &Self) -> bool {
self.part > other.part
}
fn ge(&self, other: &Self) -> bool {
!self.lt(other)
}
}
impl Lemmy {
pub(crate) async fn post(&self, post: CreatePost) -> Result<PostId, ()> {
pub(crate) fn get_community_id(&self, name: &str) -> CommunityId {
*self.communities.get(name).expect("Given community is invalid")
}
pub(crate) async fn new(config: &RwLock<Config>) -> Result<Self, ()> {
let read_config = config.read().expect("Read Lock Failed").clone();
let login_params = Login {
username_or_email: read_config.get_username(),
password: read_config.get_password(),
totp_2fa_token: None,
};
let response = match HTTP_CLIENT
.post(format!("{}/api/v3/post", &self.instance))
.bearer_auth(&self.jwt_token.to_string())
.json(&post)
.post(read_config.instance.to_owned() + "/api/v3/user/login")
.json(&login_params)
.send()
.await {
Ok(data) => {
match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
.await
{
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
return Err(());
}
};
match response.status() {
StatusCode::OK => {
let data: LoginResponse = response
.json()
.await
.expect("Successful Login Request should return JSON");
match data.jwt {
Some(token) => Ok(Lemmy {
jwt_token: token.clone(),
instance: read_config.instance.to_owned(),
communities: HashMap::new(),
}),
None => {
let err_msg = "Login did not return JWT token. Are the credentials valid?".to_owned();
error!(err_msg);
Err(())
}
}
},
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
};
let json_data = match serde_json::from_str::<HashMap<&str, PostView>>(&response) {
Ok(mut data) => data.remove("post_view").expect("Element should be present"),
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
status => {
let err_msg = format!("Unexpected HTTP Status '{}' during Login", status);
error!(err_msg);
Err(())
}
};
Ok(json_data.post.id)
}
}
async fn feature(&self, params: FeaturePost) -> Result<PostView, ()> {
let response = match HTTP_CLIENT
.post(format!("{}/api/v3/post/feature", &self.instance))
.bearer_auth(&self.jwt_token.to_string())
.json(&params)
.send()
.await {
Ok(data) => {
match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
}
},
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
};
let json_data = match serde_json::from_str::<HashMap<&str, PostView>>(&response) {
Ok(mut data) => data.remove("post_view").expect("Element should be present"),
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
};
Ok(json_data)
pub(crate) async fn logout(&self) {
let _ = self.post_data_json("/api/v3/user/logout", &"").await;
}
pub(crate) async fn unpin(&self, post_id: PostId, location: PostFeatureType) -> Result<PostView, ()> {
pub(crate) async fn post(&self, post: CreatePost) -> Option<PostId> {
let response: String = match self.post_data_json("/api/v3/post", &post).await {
Some(data) => data,
None => return None,
};
let json_data: PostView = match self.parse_json_map(&response).await {
Some(data) => data,
None => return None,
};
Some(json_data.post.id)
}
async fn feature(&self, params: FeaturePost) -> Option<PostView> {
let response: String = match self.post_data_json("/api/v3/post/feature", &params).await {
Some(data) => data,
None => return None,
};
let json_data: PostView = match self.parse_json_map(&response).await {
Some(data) => data,
None => return None,
};
Some(json_data)
}
pub(crate) async fn unpin(&self, post_id: PostId, location: PostFeatureType) -> Option<PostView> {
let pin_params = FeaturePost {
post_id,
featured: false,
@@ -140,7 +298,7 @@ impl Lemmy {
self.feature(pin_params).await
}
pub(crate) async fn pin(&self, post_id: PostId, location: PostFeatureType) -> Result<PostView, ()> {
pub(crate) async fn pin(&self, post_id: PostId, location: PostFeatureType) -> Option<PostView> {
let pin_params = FeaturePost {
post_id,
featured: true,
@@ -149,135 +307,66 @@ impl Lemmy {
self.feature(pin_params).await
}
pub(crate) async fn get_community_pinned(&self, community: CommunityId) -> Result<Vec<PostView>, ()> {
pub(crate) async fn get_community_pinned(&self, community: CommunityId) -> Option<Vec<PostView>> {
let list_params = GetPosts {
community_id: Some(community),
type_: Some(ListingType::Local),
..Default::default()
};
let response = match HTTP_CLIENT
.get(format!("{}/api/v3/post/list", &self.instance))
.bearer_auth(&self.jwt_token.to_string())
.query(&list_params)
.send()
.await {
Ok(data) => {
match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
}
},
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
let response: String = match self.get_data_query("/api/v3/post/list", &list_params).await {
Some(data) => data,
None => return None,
};
let json_data: GetPostsResponse = match self.parse_json(&response).await {
Some(data) => data,
None => return None,
};
let json_data: GetPostsResponse = match serde_json::from_str(&response) {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
};
Ok(json_data.posts.iter().filter(|post| {
post.post.featured_community
})
Some(json_data
.posts
.iter()
.filter(|post| post.post.featured_community)
.cloned()
.collect()
)
.collect())
}
pub(crate) async fn get_local_pinned(&self) -> Result<Vec<PostView>, ()> {
pub(crate) async fn get_local_pinned(&self) -> Option<Vec<PostView>> {
let list_params = GetPosts {
type_: Some(ListingType::Local),
..Default::default()
};
let response = match HTTP_CLIENT
.get(format!("{}/api/v3/post/list", &self.instance))
.bearer_auth(&self.jwt_token.to_string())
.query(&list_params)
.send()
.await {
Ok(data) => {
match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
}
},
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
let response: String = match self.get_data_query("/api/v3/post/list", &list_params).await {
Some(data) => data,
None => return None,
};
let json_data: GetPostsResponse = match self.parse_json(&response).await {
Some(data) => data,
None => return None,
};
let json_data: GetPostsResponse = match serde_json::from_str(&response) {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
};
Ok(json_data.posts.iter().filter(|post| {
post.post.featured_local
})
Some(json_data
.posts
.iter()
.filter(|post| post.post.featured_local)
.cloned()
.collect()
)
.collect())
}
pub(crate) async fn get_communities(&self) -> Result<HashMap<String, CommunityId>, ()> {
pub(crate) async fn get_communities(&mut self) {
let list_params = ListCommunities {
type_: Some(ListingType::Local),
..Default::default()
};
let response = match HTTP_CLIENT
.get(format!("{}/api/v3/community/list", &self.instance))
.bearer_auth(&self.jwt_token.to_string())
.query(&list_params)
.send()
.await {
Ok(data) => {
match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
}
},
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
let response: String = match self.get_data_query("/api/v3/community/list", &list_params).await {
Some(data) => data,
None => return,
};
let json_data: ListCommunitiesResponse = match serde_json::from_str(&response) {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(())
}
let json_data: ListCommunitiesResponse = match self.parse_json::<ListCommunitiesResponse>(&response).await {
Some(data) => data,
None => return,
};
let mut communities: HashMap<String, CommunityId> = HashMap::new();
@@ -286,6 +375,76 @@ impl Lemmy {
communities.insert(community.name, community.id);
}
Ok(communities)
self.communities = communities;
}
async fn post_data_json<T: Serialize>(&self, route: &str, json: &T) -> Option<String> {
let res = HTTP_CLIENT
.post(format!("{}{route}", &self.instance))
.bearer_auth(&self.jwt_token.to_string())
.json(&json)
.send()
.await;
self.extract_data(res).await
}
async fn get_data_query<T: Serialize>(&self, route: &str, param: &T ) -> Option<String> {
let res = HTTP_CLIENT
.get(format!("{}{route}", &self.instance))
.bearer_auth(&self.jwt_token.to_string())
.query(&param)
.send()
.await;
self.extract_data(res).await
}
async fn extract_data(&self, response: Result<reqwest::Response, reqwest::Error>) -> Option<String> {
match response {
Ok(data) => {
if data.status().is_success() {
match data.text().await {
Ok(data) => Some(data),
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
None
}
}
}
else {
let err_msg = format!("HTTP Request failed: {}", data.text().await.unwrap());
error!(err_msg);
None
}
},
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
None
}
}
}
async fn parse_json<'a, T: Deserialize<'a>>(&self, response: &'a str) -> Option<T> {
match serde_json::from_str::<T>(response) {
Ok(data) => Some(data),
Err(e) => {
                let err_msg = format!("while parsing JSON: {e}");
error!(err_msg);
None
}
}
}
async fn parse_json_map<'a, T: Deserialize<'a>>(&self, response: &'a str) -> Option<T> {
debug!(response);
match serde_json::from_str::<HashMap<&str, T>>(response) {
Ok(mut data) => Some(data.remove("post_view").expect("Element should be present")),
Err(e) => {
let err_msg = format!("while parsing JSON HashMap: {e}");
error!(err_msg);
None
}
}
}
}
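The new `extract_data` helper funnels both transport failures and non-success HTTP statuses into `None`, so callers only have to match on `Option`. A minimal standalone sketch of that control flow, with `reqwest::Response` replaced by a hypothetical plain `(status, body)` pair so it runs without the crate:

```rust
// Illustrative sketch only: the real extract_data above operates on
// reqwest::Response; here the response is modelled as a plain
// (status, body) pair so the branching can be exercised standalone.
fn extract_data(response: Result<(u16, String), String>) -> Option<String> {
    match response {
        // Success path: a 2xx status yields the body.
        Ok((status, body)) if (200..300).contains(&status) => Some(body),
        // HTTP-level failure: log and discard.
        Ok((status, _)) => {
            eprintln!("HTTP Request failed with status {status}");
            None
        }
        // Transport-level failure: log and discard.
        Err(e) => {
            eprintln!("{e}");
            None
        }
    }
}

fn main() {
    assert_eq!(extract_data(Ok((200, "ok".to_owned()))), Some("ok".to_owned()));
    assert_eq!(extract_data(Ok((500, "oops".to_owned()))), None);
    assert_eq!(extract_data(Err("timeout".to_owned())), None);
}
```

Collapsing both failure modes into `None` is what lets `get_data_query` and `post_data_json` share one short tail (`self.extract_data(res).await`) instead of duplicating the match at every call site.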


@@ -1,96 +1,41 @@
use chrono::Duration;
use log::LevelFilter;
use once_cell::sync::Lazy;
use reqwest::Client;
use systemd_journal_logger::JournalLog;
use crate::bot::Bot;
mod bot;
mod config;
mod lemmy;
mod post_history;
mod fetchers;
pub static HTTP_CLIENT: Lazy<Client> = Lazy::new(|| {
Client::builder()
        .timeout(Duration::seconds(10).to_std().unwrap())
        .connect_timeout(Duration::seconds(10).to_std().unwrap())
.build()
.expect("build client")
});
#[tokio::main]
async fn main() {
JournalLog::new()
.expect("Systemd-Logger crate error")
.install()
.expect("Systemd-Logger crate error");
match std::env::var("LOG_LEVEL") {
Ok(level) => {
match level.as_str() {
"debug" => log::set_max_level(LevelFilter::Debug),
"info" => log::set_max_level(LevelFilter::Info),
_ => log::set_max_level(LevelFilter::Info),
}
}
_ => log::set_max_level(LevelFilter::Info),
}
let mut bot = Bot::new();
bot.run().await;
}
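The reworked `main` picks the log level from a `LOG_LEVEL` environment variable, with anything other than `"debug"` falling back to `Info`. A dependency-free sketch of that selection, with the `log` crate's `LevelFilter` replaced by a plain `&str` and the environment lookup result passed in directly so the logic can be tested without touching the process environment:

```rust
// Standalone sketch of the LOG_LEVEL selection above; level_from is a
// hypothetical helper, not part of the bot's code.
fn level_from(env_value: Option<&str>) -> &'static str {
    match env_value {
        Some("debug") => "debug",
        // "info", unknown values, and a missing variable all fall
        // back to the info level, mirroring the match in main().
        _ => "info",
    }
}

fn main() {
    assert_eq!(level_from(Some("debug")), "debug");
    assert_eq!(level_from(Some("info")), "info");
    assert_eq!(level_from(Some("verbose")), "info");
    assert_eq!(level_from(None), "info");
    // In the real main() the value comes from std::env::var("LOG_LEVEL").
}
```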


@@ -1,75 +1,53 @@
use serde_derive::{Deserialize, Serialize};
use std::collections::HashMap;
use systemd_journal_logger::connected_to_journal;
macro_rules! info {
($msg:tt) => {
match connected_to_journal() {
true => log::info!("[INFO] {}", $msg),
false => println!("[INFO] {}", $msg),
}
};
}
macro_rules! error {
($msg:tt) => {
match connected_to_journal() {
true => log::error!("[ERROR] {}", $msg),
false => eprintln!("[ERROR] {}", $msg),
}
};
}
#[derive(Serialize, Deserialize, Default, Clone, Debug)]
pub(crate) struct SeriesHistory {
pub(crate) series: HashMap<String, PostHistory>,
}
impl SeriesHistory {
pub(crate) fn load_history() -> Self {
let info_msg = "Loading History";
info!(info_msg);
match confy::load(env!("CARGO_PKG_NAME"), "history") {
Ok(data) => data,
Err(e) => panic!("history.toml not found: {e}"),
}
}
pub(crate) fn save_history(&self) {
let info_msg = "Saving History";
info!(info_msg);
if let Err(e) = confy::store(env!("CARGO_PKG_NAME"), "history", self) {
let err_msg = format!("Unexpected error saving to history.toml: {e}");
error!(err_msg);
}
}
pub(crate) fn check_for_post(&self, series: &str, part: &str, title: &str) -> bool {
if let Some(series_map) = self.series.get(series) {
if let Some(part_info) = series_map.parts.get(part) {
return part_info.volume == title || part_info.chapter == title;
}
}
false
@@ -79,15 +57,15 @@ impl SeriesHistory {
match self.series.get(series) {
Some(history) => history.clone(),
None => PostHistory {
parts: HashMap::new(),
},
}
}
pub(crate) fn set_series(&mut self, series: &str, data: PostHistory) {
self.series
.entry(series.to_owned())
.and_modify(|val| *val = data.clone())
.or_insert(data);
}
}
@@ -104,14 +82,14 @@ impl PostHistory {
None => PostHistoryInner {
volume: "".to_owned(),
chapter: "".to_owned(),
},
}
}
pub(crate) fn set_part(&mut self, part: &str, data: PostHistoryInner) {
self.parts
.entry(part.to_owned())
.and_modify(|val| *val = data.clone())
.or_insert(data);
}
}
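Both `set_series` and `set_part` use the same `entry` upsert shape: overwrite the value when the key exists, insert it otherwise. The same pattern on a plain `String` map, shown standalone (`upsert` is a hypothetical helper, not part of the bot's code):

```rust
use std::collections::HashMap;

// Same entry().and_modify().or_insert() shape as set_series/set_part:
// and_modify clones `data` into an existing slot; or_insert moves it
// into a fresh one when the key is absent.
fn upsert(map: &mut HashMap<String, String>, key: &str, data: String) {
    map.entry(key.to_owned())
        .and_modify(|val| *val = data.clone())
        .or_insert(data);
}

fn main() {
    let mut parts = HashMap::new();
    upsert(&mut parts, "part-1", "Volume 1".to_owned());
    upsert(&mut parts, "part-1", "Volume 2".to_owned());
    assert_eq!(parts.len(), 1);
    assert_eq!(parts["part-1"], "Volume 2");
}
```

The clone in `and_modify` is what lets `or_insert` still take `data` by value; the closure's borrow ends before `or_insert` runs, so this compiles without a fight with the borrow checker.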