Compare commits


46 commits
2.2.0 ... main

Author SHA1 Message Date
041590a559
Release 3.2.0
All checks were successful
Run Tests on Code / run-tests (push) Successful in 35s
Build and Release Binary File / test (push) Successful in 35s
Build and Release Binary File / build (push) Successful in 39s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 6s
2024-10-22 16:15:19 +02:00
d6f883f890
Bump API version
All checks were successful
Run Tests on Code / run-tests (push) Successful in 35s
2024-10-22 16:14:46 +02:00
b07420e0bd Merge pull request 'Bump forgejo-release to v2' (#26) from actions-update into main
All checks were successful
Run Tests on Code / run-tests (push) Successful in 19s
Reviewed-on: https://forgejo.neshweb.net/Neshura/aob-lemmy-bot/pulls/26
2024-08-06 12:23:26 +00:00
0fc71f0a7d Bump forgejo-release to v2
All checks were successful
Run Tests on Code / run-tests (push) Successful in 1m7s
Build binary file and bundle packages / test (pull_request) Successful in 1m6s
Build binary file and bundle packages / build (pull_request) Successful in 1m10s
2024-08-06 12:19:37 +00:00
2ae6468ad8
Release 3.1.0
All checks were successful
Run Tests on Code / run-tests (push) Successful in 47s
Build and Release Binary File / test (push) Successful in 16s
Build and Release Binary File / build (push) Successful in 48s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 37s
2024-07-15 22:15:49 +02:00
2ecfe88cb9
Update to 0.19.5 and include optional thumbnail
2024-07-15 22:15:31 +02:00
7dcc7bfee2
Release 3.0.3
All checks were successful
Run Tests on Code / run-tests (push) Successful in 33s
Build and Release Binary File / test (push) Successful in 33s
Build and Release Binary File / build (push) Successful in 36s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 7s
2024-05-08 16:34:48 +02:00
94d8a4e673
Add Timeout to Status Ping HTTP Request
All checks were successful
Run Tests on Code / run-tests (push) Successful in 14s
2024-05-08 16:34:32 +02:00
1b585eab7e
Release 3.0.2
All checks were successful
Run Tests on Code / run-tests (push) Successful in 37s
Build and Release Binary File / test (push) Successful in 37s
Build and Release Binary File / build (push) Successful in 37s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 7s
2024-05-07 23:50:14 +02:00
b6f5c38e4a
Overhaul Error handling (Option instead of Result<T, ()>) + Logging changes
All checks were successful
Run Tests on Code / run-tests (push) Successful in 15s
2024-05-07 23:49:55 +02:00
5d708bdb82
Release 3.0.1
All checks were successful
Run Tests on Code / run-tests (push) Successful in 34s
Build and Release Binary File / test (push) Successful in 34s
Build and Release Binary File / build (push) Successful in 38s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 8s
2024-05-07 22:50:22 +02:00
6a8c1662f0
Fix enum problems in config
All checks were successful
Run Tests on Code / run-tests (push) Successful in 16s
2024-05-07 22:50:12 +02:00
e02cd900ed
Release 3.0.0
All checks were successful
Run Tests on Code / run-tests (push) Successful in 41s
Build and Release Binary File / test (push) Successful in 35s
Build and Release Binary File / build (push) Successful in 37s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 8s
2024-05-07 22:35:15 +02:00
32ea83a7bb
Release Candidate 3.0.0-rc.2
All checks were successful
Run Tests on Code / run-tests (push) Successful in 34s
Build and Release Binary File / test (push) Successful in 32s
Build and Release Binary File / build (push) Successful in 37s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 8s
2024-05-07 22:30:02 +02:00
4297860b9e
Legacy fixes for async traits
All checks were successful
Run Tests on Code / run-tests (push) Successful in 32s
2024-05-07 22:29:48 +02:00
affe62b973
Release Candidate 3.0.0-rc.1
Some checks failed
Build and Release Binary File / test (push) Failing after 27s
Run Tests on Code / run-tests (push) Failing after 39s
Build and Release Binary File / build (push) Has been skipped
Build and Release Binary File / upload-generic-package (push) Has been skipped
Build and Release Binary File / upload-debian-package (push) Has been skipped
Build and Release Binary File / upload-release (push) Has been skipped
2024-05-07 22:11:16 +02:00
6520cc65a3
Rewrite Version 3
Some checks failed
Run Tests on Code / run-tests (push) Failing after 36s
2024-05-07 22:10:21 +02:00
aefceda628
Release 2.2.8
All checks were successful
Run Tests on Code / run-tests (push) Successful in 33s
Build and Release Binary File / test (push) Successful in 33s
Build and Release Binary File / build (push) Successful in 36s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 7s
2024-05-06 22:57:55 +02:00
ee5a159431
Yet another another logging fix
All checks were successful
Run Tests on Code / run-tests (push) Successful in 11s
2024-05-06 22:57:41 +02:00
85f8b97607
Yet another logging fix
All checks were successful
Run Tests on Code / run-tests (push) Successful in 11s
2024-05-06 22:57:05 +02:00
966dd8f359
Release 2.2.7
All checks were successful
Run Tests on Code / run-tests (push) Successful in 34s
Build and Release Binary File / test (push) Successful in 31s
Build and Release Binary File / build (push) Successful in 35s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 7s
2024-05-06 22:54:15 +02:00
17e161bc27
Small fixes to logging
2024-05-06 22:54:01 +02:00
3928367692
Release 2.2.6
All checks were successful
Run Tests on Code / run-tests (push) Successful in 32s
Build and Release Binary File / test (push) Successful in 33s
Build and Release Binary File / build (push) Successful in 35s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 8s
2024-05-06 22:36:33 +02:00
070eae961a
Yet more logging and delay between series handling requests (should avoid potential rate limit)
Some checks failed
Run Tests on Code / run-tests (push) Has been cancelled
2024-05-06 22:36:21 +02:00
92103e28ba
Release 2.2.5
All checks were successful
Run Tests on Code / run-tests (push) Successful in 35s
Build and Release Binary File / test (push) Successful in 33s
Build and Release Binary File / build (push) Successful in 35s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 2s
Build and Release Binary File / upload-release (push) Successful in 8s
2024-05-06 22:21:14 +02:00
cd78d3c1c7
Reduced connection timeouts and added some logging
Some checks failed
Run Tests on Code / run-tests (push) Has been cancelled
2024-05-06 22:21:04 +02:00
22bbaaa002
Release 2.2.4
All checks were successful
Run Tests on Code / run-tests (push) Successful in 36s
Build and Release Binary File / test (push) Successful in 35s
Build and Release Binary File / build (push) Successful in 36s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 11s
2024-05-06 21:53:14 +02:00
9ee9db5792
Dependency bumps
Some checks failed
Run Tests on Code / run-tests (push) Has been cancelled
2024-05-06 21:53:02 +02:00
23ac0de189
Release 2.2.3
All checks were successful
Run Tests on Code / run-tests (push) Successful in 34s
Build and Release Binary File / test (push) Successful in 32s
Build and Release Binary File / build (push) Successful in 37s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 11s
2024-05-06 21:28:00 +02:00
2dc695577e
Further Bugfixing
2024-05-06 21:27:47 +02:00
534a8022a9
Release 2.2.2
All checks were successful
Run Tests on Code / run-tests (push) Successful in 33s
Build and Release Binary File / test (push) Successful in 32s
Build and Release Binary File / build (push) Successful in 38s
Build and Release Binary File / upload-generic-package (push) Successful in 1s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 8s
2024-05-06 21:17:08 +02:00
45bfca8cc5
Bugfix due to bad deduplication
All checks were successful
Run Tests on Code / run-tests (push) Successful in 13s
2024-05-06 21:16:53 +02:00
34b3bb45c5
Release 2.2.1
All checks were successful
Run Tests on Code / run-tests (push) Successful in 33s
Build and Release Binary File / test (push) Successful in 33s
Build and Release Binary File / build (push) Successful in 37s
Build and Release Binary File / upload-generic-package (push) Successful in 2s
Build and Release Binary File / upload-debian-package (push) Successful in 1s
Build and Release Binary File / upload-release (push) Successful in 13s
2024-05-06 21:06:17 +02:00
3e78ce5bf6
Merge pull request 'Various Fixes and Changes' (#21) from hotfix-1 into main
All checks were successful
Run Tests on Code / run-tests (push) Successful in 16s
2024-05-06 21:05:07 +02:00
fc4ce74567
Clippy changes
All checks were successful
Build binary file and bundle packages / test (pull_request) Successful in 36s
Run Tests on Code / run-tests (push) Successful in 38s
Build binary file and bundle packages / build (pull_request) Successful in 36s
2024-05-06 21:00:55 +02:00
85f3044224
Lemmy API Crate bump
2024-05-06 20:57:35 +02:00
d50bf01db6
Fix erroneous print statement to use logger
2024-05-06 20:57:32 +02:00
c8d7053b87
De-duplicate Code
2024-05-06 20:57:29 +02:00
962d90fe1d
Remove unnecessary thread structure
2024-05-06 20:57:26 +02:00
36b59240d9
Use macros over functions for error logging
2024-05-06 20:57:20 +02:00
1cd30b1145
Add async-trait crate
Some checks failed
Run Tests on Code / run-tests (push) Failing after 31s
2024-01-08 21:07:49 +01:00
167fcdb7ad
Import fetchers instead of jnovel module in main
2024-01-08 21:07:36 +01:00
b9a26a7b1c
Partially adapt bot module to changes due to fetchers modularization
2024-01-08 21:07:17 +01:00
e5862ba0ec
Move relevant structs to lemmy module
2024-01-08 21:06:52 +01:00
6bd7369ecc
Move jnovel module into fetchers module
2024-01-08 21:06:25 +01:00
ba3110da0e
Add fetchers trait module
2024-01-08 21:06:18 +01:00
11 changed files with 1705 additions and 1111 deletions


Release workflow (filename not captured):

```diff
@@ -137,7 +137,7 @@ jobs:
         run: rm release_blobs/build.env
       -
         name: Release New Version
-        uses: actions/forgejo-release@v1
+        uses: actions/forgejo-release@v2
         with:
           direction: upload
           url: https://forgejo.neshweb.net
```

Cargo.lock (generated, 738 changed lines): diff suppressed because it is too large.


Cargo.toml:

```diff
@@ -1,7 +1,7 @@
 [package]
 authors = ["Neshura"]
 name = "aob-lemmy-bot"
-version = "2.2.0"
+version = "3.2.0"
 edition = "2021"
 description = "Bot for automatically posting new chapters of 'Ascendance of a Bookworm' released by J-Novel Club"
 license = "GPL-3.0-or-later"
@@ -16,18 +16,20 @@ systemd-units = { enable = false }
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
 [dependencies]
-chrono = "^0.4.26"
-lemmy_api_common = "0.19.1"
-lemmy_db_schema = "0.19.1"
-once_cell = "^1.18.0"
-reqwest = { version = "^0.11.18", features = ["blocking", "json"] }
-serde = "^1.0.164"
-serde_derive = "^1.0.164"
-serde_json = "^1.0.97"
-strum_macros = "^0.25.0"
-tokio = { version = "^1.32.0", features = ["rt", "rt-multi-thread", "macros"] }
-url = "^2.4.0"
-confy = "^0.5.1"
-toml = "^0.8.8"
+chrono = "^0.4"
+lemmy_api_common = "0.19.5"
+lemmy_db_schema = "0.19.5"
+once_cell = "^1.19"
+reqwest = { version = "^0.12", features = ["blocking", "json"] }
+serde = "^1.0"
+serde_derive = "^1.0"
+serde_json = "^1.0"
+strum_macros = "^0.26"
+tokio = { version = "^1.37", features = ["rt", "rt-multi-thread", "macros"] }
+url = "^2.5"
+confy = "^0.6"
+toml = "^0.8"
 systemd-journal-logger = "^2.1.1"
-log = "^0.4.20"
+log = "^0.4"
+async-trait = "^0.1"
+notify = "6.1.1"
```


src/bot.rs (name inferred from content), rewritten (@@ -1,271 +1,136 @@): the free functions `run`, `idle`, and `handle_series` over `Arc<RwLock<SharedData>>` are removed; a `Bot` struct now owns the shared config, hot-reloads it via a `notify` file watcher, and delegates per-series posting to `SeriesConfig::update`. New file contents:

```rust
use crate::{config::{Config}, HTTP_CLIENT};
use crate::lemmy::{Lemmy};
use crate::post_history::{SeriesHistory};
use chrono::{DateTime, Duration, Utc};
use std::sync::{Arc, RwLock};
use notify::{Event, EventKind, event::{AccessKind, AccessMode}, RecursiveMode, Watcher};
use tokio::time::sleep;
use systemd_journal_logger::connected_to_journal;

macro_rules! debug {
    ($msg:tt) => {
        match connected_to_journal() {
            true => log::debug!("[DEBUG] {}", $msg),
            false => println!("[DEBUG] {}", $msg),
        }
    };
}

macro_rules! info {
    ($msg:tt) => {
        match connected_to_journal() {
            true => log::info!("[INFO] {}", $msg),
            false => println!("[INFO] {}", $msg),
        }
    };
}

macro_rules! error {
    ($msg:tt) => {
        match connected_to_journal() {
            true => log::error!("[ERROR] {}", $msg),
            false => eprintln!("[ERROR] {}", $msg),
        }
    };
}

pub(crate) struct Bot {
    shared_config: Arc<RwLock<Config>>,
    history: SeriesHistory,
    run_start_time: DateTime<Utc>
}

enum Wait {
    Absolute,
    Buffer
}

impl Bot {
    pub(crate) fn new() -> Self {
        let config = Config::load();
        let shared_config: Arc<RwLock<Config>> = Arc::new(RwLock::new(config));

        let shared_config_copy = shared_config.clone();
        let mut watcher = notify::recommended_watcher(move |res: Result<Event, notify::Error>| {
            match res {
                Ok(event) => {
                    if event.kind == EventKind::Access(AccessKind::Close(AccessMode::Write)) {
                        let mut write = shared_config_copy.write().expect("Write Lock Failed");
                        let new_config = Config::load();
                        write.series = new_config.series;
                        write.instance = new_config.instance;
                        write.protected_communities = new_config.protected_communities;
                        write.status_post_url = new_config.status_post_url;
                        info!("Reloaded Configuration");
                    }
                },
                Err(e) => {
                    let msg = format!("Error watching files: {e}");
                    error!(msg);
                }
            }
        }).expect("Watcher Error");

        watcher.watch(&Config::get_path(), RecursiveMode::NonRecursive).expect("Error in watcher");

        let history: SeriesHistory = SeriesHistory::load_history();

        Bot { shared_config, history, run_start_time: Utc::now() }
    }

    pub(crate) async fn run(&mut self) {
        loop {
            let mut lemmy = match Lemmy::new(&self.shared_config).await {
                Ok(data) => data,
                Err(_) => continue,
            };

            lemmy.get_communities().await;

            self.history = SeriesHistory::load_history();

            let start: DateTime<Utc> = Utc::now();
            while Utc::now() - start <= Duration::minutes(60) {
                self.run_start_time = Utc::now();
                self.ping_status().await;
                let read_copy = self.shared_config.read().expect("Read Lock Failed").clone();
                for series in read_copy.series {
                    series.update(&mut self.history, &lemmy, &self.shared_config).await;
                    debug!("Done Updating Series");
                    self.wait(1, Wait::Absolute).await;
                }
                debug!("Awaiting Timeout");
                self.wait(30, Wait::Buffer).await;
                debug!("Pinging Server");
                self.ping_status().await;
                debug!("Awaiting Timeout 2");
                self.wait(30, Wait::Absolute).await;
            }

            lemmy.logout().await;
        }
    }

    async fn ping_status(&self) {
        let read_config = &self.shared_config.read().expect("Read Lock Failed").clone();
        if let Some(status_url) = &read_config.status_post_url {
            match HTTP_CLIENT.get(status_url).send().await {
                Ok(_) => {},
                Err(e) => {
                    let err_msg = format!("While pinging status URL: {e}");
                    error!(err_msg);
                }
            }
        }
    }

    async fn wait(&self, seconds: i64, start_time: Wait) {
        let duration: Duration = Duration::seconds(seconds);
        let start_time: DateTime<Utc> = match start_time {
            Wait::Absolute => Utc::now(),
            Wait::Buffer => self.run_start_time,
        };
        while Utc::now() - start_time < duration {
            sleep(Duration::milliseconds(100).to_std().unwrap()).await
        }
    }
}
```
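The `Bot::wait` method in the diff above distinguishes an absolute delay from one measured against the loop's start time (`Wait::Buffer` only waits out whatever remains of the window). This is not part of the diff itself, but a stand-alone sketch of that timing pattern using only `std::time` (the bot uses `chrono` and tokio's async `sleep`; the names here are illustrative):

```rust
use std::time::{Duration, Instant};
use std::thread::sleep;

// Stand-in for the bot's `Wait` enum: `Absolute` measures the delay from now,
// `Buffer` measures it from an earlier reference point (`run_start`), so time
// already spent working counts against the wait.
enum Wait {
    Absolute,
    Buffer,
}

fn wait(seconds: u64, mode: Wait, run_start: Instant) {
    let duration = Duration::from_secs(seconds);
    let base = match mode {
        Wait::Absolute => Instant::now(),
        Wait::Buffer => run_start,
    };
    // Poll in small steps until the window measured from `base` has elapsed.
    while base.elapsed() < duration {
        sleep(Duration::from_millis(10));
    }
}

fn main() {
    let run_start = Instant::now();
    sleep(Duration::from_millis(200)); // simulate work done since the window began
    let before = Instant::now();
    wait(1, Wait::Buffer, run_start); // only ~800 ms actually remain
    assert!(before.elapsed() < Duration::from_secs(1));
    println!("buffer wait skipped the already-elapsed portion");
}
```

The effect is a fixed-length loop iteration: work time is absorbed into the 30-second buffer rather than added on top of it.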


src/config.rs (name inferred from content), changed (@@ -1,12 +1,57 @@ and following hunks): `username`/`password` move from plain `String` with `Sensitive<String>` getters (lemmy_api_common) to `SensitiveString` (lemmy_db_schema), a `get_path` helper is added for the file watcher, and `SeriesConfig` gains a `fetcher` field plus an async `update` method that absorbs the old posting logic. New contents of the changed hunks, unchanged code elided with `// ...`:

```rust
use std::path::PathBuf;
use std::sync::{Arc, RwLock};
use chrono::{Timelike, Utc};
use crate::config::PostBody::Description;
use lemmy_db_schema::PostFeatureType;
use lemmy_db_schema::sensitive::SensitiveString;
use serde_derive::{Deserialize, Serialize};
use crate::lemmy::{Lemmy, PartInfo, PostType};
use crate::post_history::{SeriesHistory};
use systemd_journal_logger::connected_to_journal;
use crate::fetchers::{FetcherTrait, Fetcher};
use crate::fetchers::jnovel::{JNovelFetcher};

macro_rules! debug {
    ($msg:tt) => {
        match connected_to_journal() {
            true => log::debug!("[DEBUG] {}", $msg),
            false => println!("[DEBUG] {}", $msg),
        }
    };
}

macro_rules! info {
    ($msg:tt) => {
        match connected_to_journal() {
            true => log::info!("[INFO] {}", $msg),
            false => println!("[INFO] {}", $msg),
        }
    };
}

macro_rules! warn {
    ($msg:tt) => {
        match connected_to_journal() {
            true => log::warn!("[WARN] {}", $msg),
            false => println!("[WARN] {}", $msg),
        }
    };
}

macro_rules! error {
    ($msg:tt) => {
        match connected_to_journal() {
            true => log::error!("[ERROR] {}", $msg),
            false => eprintln!("[ERROR] {}", $msg),
        }
    };
}

#[derive(Serialize, Deserialize, Clone, Debug)]
pub(crate) struct Config {
    pub(crate) instance: String,
    username: SensitiveString,
    password: SensitiveString,
    pub(crate) status_post_url: Option<String>,
    pub(crate) config_reload_seconds: u32,
    pub(crate) protected_communities: Vec<String>,
    // ...
}

impl Config {
    // ...
        cfg
    }

    pub(crate) fn get_path() -> PathBuf {
        confy::get_configuration_file_path(env!("CARGO_PKG_NAME"), "config").expect("Application will not without confy")
    }

    pub(crate) fn get_username(&self) -> SensitiveString {
        self.username.clone()
    }

    pub(crate) fn get_password(&self) -> SensitiveString {
        self.password.clone()
    }
}

impl Default for Config {
    fn default() -> Self {
        Config {
            instance: "".to_owned(),
            username: SensitiveString::from("".to_owned()),
            password: SensitiveString::from("".to_owned()),
            status_post_url: None,
            config_reload_seconds: 21600,
            protected_communities: vec![],
            // ...
        }
    }
}

pub(crate) struct SeriesConfig {
    // ...
    pub(crate) parted: bool,
    pub(crate) prepub_community: PostConfig,
    pub(crate) volume_community: PostConfig,
    pub(crate) fetcher: Fetcher
}

impl SeriesConfig {
    pub(crate) async fn update(&self, history: &mut SeriesHistory, lemmy: &Lemmy, config: &Arc<RwLock<Config>>) {
        let info_msg = format!("Checking {} for Updates", self.slug);
        info!(info_msg);

        let mut fetcher: Fetcher = match &self.fetcher {
            Fetcher::Jnc(_) => {
                Fetcher::Jnc(JNovelFetcher::new())
            },
            /*default => {
                let err_msg = format!("Fetcher {default} not implemented");
                error!(err_msg);
                return;
            }*/
        };

        match fetcher {
            Fetcher::Jnc(ref mut jnc) => {
                jnc.set_series(self.slug.clone());
                jnc.set_part_option(self.parted);
            }
        }

        let post_list = match fetcher.check_feed().await {
            Ok(data) => data,
            Err(_) => {
                let err_msg = format!("While checking feed for {}", self.slug);
                error!(err_msg);
                return;
            }
        };

        if post_list.is_empty() && Utc::now().minute() % 10 == 0 {
            let info_msg = "No Updates found";
            info!(info_msg);
        }

        for post_info in post_list.iter() {
            if history.check_for_post(
                self.slug.as_str(),
                post_info.get_part_info().unwrap_or(PartInfo::NoParts).as_string().as_str(),
                post_info.get_info().title.as_str()
            ) {
                continue
            }
            let post_data = post_info.get_post_data(self, lemmy);

            let info = format!(
                "Posting '{}' to {}",
                post_info.get_info().title.as_str(),
                post_info.get_post_config(self).name.as_str()
            );
            info!(info);

            let post_id = match lemmy.post(post_data).await {
                Some(data) => data,
                None => {
                    error!("Error posting chapter");
                    return;
                }
            };

            let read_config = config.read().expect("Read Lock Failed").clone();
            if post_info.get_post_config(self).pin_settings.pin_new_post_community
                && !read_config
                    .protected_communities
                    .contains(&post_info.get_post_config(self).name)
            {
                let info = format!(
                    "Pinning '{}' to {}",
                    post_info.get_info().title,
                    post_info.get_post_config(self).name.as_str()
                );
                info!(info);
                let pinned_posts = lemmy.get_community_pinned(lemmy.get_community_id(&post_info.get_post_config(self).name)).await.unwrap_or_else(|| {
                    error!("Pinning of Post to community failed");
                    vec![]
                });
                if !pinned_posts.is_empty() {
                    let community_pinned_post = &pinned_posts[0];
                    if lemmy.unpin(community_pinned_post.post.id, PostFeatureType::Community).await.is_none() {
                        error!("Error un-pinning post");
                    }
                }
                if lemmy.pin(post_id, PostFeatureType::Community).await.is_none() {
                    error!("Error pinning post");
                }
            } else if read_config
                .protected_communities
                .contains(&post_info.get_post_config(self).name)
            {
                let message = format!(
                    "Community '{}' for Series '{}' is protected. Is this intended?",
                    &post_info.get_post_config(self).name, self.slug
                );
                warn!(message);
            }

            if post_info.get_post_config(self).pin_settings.pin_new_post_local {
                let info = format!("Pinning '{}' to Instance", post_info.get_info().title);
                info!(info);
                let pinned_posts = match lemmy.get_local_pinned().await {
                    Some(data) => {data}
                    None => {
                        error!("Error fetching pinned posts");
                        vec![]
                    }
                };
                if !pinned_posts.is_empty() {
                    for pinned_post in pinned_posts {
                        if read_config
                            .protected_communities
                            .contains(&pinned_post.community.name)
                        {
                            continue;
                        } else {
                            let community_pinned_post = &pinned_post;
                            if lemmy.unpin(community_pinned_post.post.id, PostFeatureType::Local).await.is_none() {
                                error!("Error pinning post");
                                continue;
                            }
                            break;
                        }
                    }
                }
                if lemmy.pin(post_id, PostFeatureType::Local).await.is_none() {
                    error!("Error pinning post");
                };
            }

            let mut series_history = history.get_series(self.slug.as_str());
            let mut part_history = series_history.get_part(post_info.get_part_info().unwrap_or(PartInfo::NoParts).as_string().as_str());
            match post_info.post_type {
                Some(post_type) => {
                    match post_type {
                        PostType::Chapter => part_history.chapter = post_info.get_info().title,
                        PostType::Volume => part_history.volume = post_info.get_info().title,
                    }
                }
                None => part_history.chapter = post_info.get_info().title,
            }
            series_history.set_part(post_info.get_part_info().unwrap_or(PartInfo::NoParts).as_string().as_str(), part_history);
            history
                .set_series(self.slug.as_str(), series_history);
            debug!("Saving History");
            history.save_history();
        }
    }
}

#[derive(Debug, Serialize, Deserialize, Clone)]
// ...
```
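`SeriesConfig::update` above only logs "No Updates found" when `Utc::now().minute() % 10 == 0`, so a check that runs roughly once a minute stays quiet most of the time. Not part of the diff, but a distilled, runnable sketch of that throttling predicate (the function name is illustrative):

```rust
// Emit the idle message only when the current minute is a multiple of 10,
// i.e. at most six times per hour for a once-a-minute check.
fn should_log_idle(minute: u32) -> bool {
    minute % 10 == 0
}

fn main() {
    let logged: Vec<u32> = (0..60).filter(|m| should_log_idle(*m)).collect();
    assert_eq!(logged, vec![0, 10, 20, 30, 40, 50]);
    println!("idle message logged at minutes: {:?}", logged);
}
```

The trade-off: the message can be skipped entirely if no check happens to land on a multiple-of-10 minute, which is acceptable here since it is purely informational.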

src/fetchers/jnovel.rs (new file, 307 lines; the capture ends mid-file):

```rust
use crate::{HTTP_CLIENT};
use chrono::{DateTime, Duration, Utc};
use serde_derive::{Deserialize, Serialize};
use std::collections::HashMap;
use std::ops::Sub;
use async_trait::async_trait;
use crate::fetchers::{FetcherTrait};
use crate::lemmy::{PartInfo, PostInfo, PostInfoInner, PostType};
use systemd_journal_logger::connected_to_journal;
use crate::lemmy::PartInfo::{NoParts, Part};

macro_rules! error {
    ($msg:tt) => {
        match connected_to_journal() {
            true => log::error!("[ERROR] {}", $msg),
            false => eprintln!("[ERROR] {}", $msg),
        }
    };
}

macro_rules! info {
    ($msg:tt) => {
        match connected_to_journal() {
            true => log::info!("[INFO] {}", $msg),
            false => println!("[INFO] {}", $msg),
        }
    };
}

static PAST_DAYS_ELIGIBLE: u8 = 4;

macro_rules! api_url {
    () => {
        "https://labs.j-novel.club/app/v2".to_owned()
    };
}

macro_rules! jnc_base_url {
    () => {
        "https://j-novel.club".to_owned()
    };
}

#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
struct VolumesWrapper {
    volumes: Vec<VolumeDetail>,
    pagination: PaginationInfo,
}

#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
struct ChapterWrapper {
    parts: Vec<ChapterDetail>,
    pagination: PaginationInfo,
}

#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
struct PaginationInfo {
    limit: usize,
    skip: usize,
    #[serde(alias = "lastPage")]
    last_page: bool,
}

#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub(crate) struct Cover {
    #[serde(alias = "coverUrl")]
    pub(crate) cover: String,
    #[serde(alias = "thumbnailUrl")]
    pub(crate) thumbnail: String,
}

#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub(crate) struct VolumeDetail {
    pub(crate) title: String,
    pub(crate) slug: String,
    number: u8,
    publishing: String,
    #[serde(alias = "shortDescription")]
    short_description: String,
    cover: Cover,
}

#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub(crate) struct ChapterDetail {
    pub(crate) title: String,
    pub(crate) slug: String,
    launch: String,
    pub(crate) cover: Option<Cover>,
}

#[derive(Deserialize, Serialize, Debug, Clone)]
pub(crate) struct JNovelFetcher {
    series_slug: String,
    series_has_parts: bool
}

impl Default for JNovelFetcher {
    fn default() -> Self {
        Self {
            series_slug: "".to_owned(),
            series_has_parts: false,
        }
    }
}

impl JNovelFetcher {
    pub(crate) fn set_series(&mut self, series: String) {
        self.series_slug = series;
    }

    pub(crate) fn set_part_option(&mut self, has_parts: bool) {
        self.series_has_parts = has_parts;
    }
}

#[async_trait]
impl FetcherTrait for JNovelFetcher {
    fn new() -> Self {
        JNovelFetcher {
            series_slug: "".to_owned(),
            series_has_parts: false
        }
    }

    async fn check_feed(&self) -> Result<Vec<PostInfo>, ()> {
        let response = match HTTP_CLIENT
        // (capture ends here)
```
.get(api_url!() + "/series/" + self.series_slug.as_str() + "/volumes?format=json")
.send()
.await
{
Ok(data) => match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("While checking feed: {e}");
error!(err_msg);
return Err(());
}
},
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
return Err(());
}
};
let mut volume_brief_data: VolumesWrapper = match serde_json::from_str(&response) {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
return Err(());
}
};
volume_brief_data.volumes.reverse(); // Makes breaking out of the volume loop easier
// If no parts just use 0 as Part indicator as no Series with Parts has a Part 0
let mut volume_map: HashMap<u8, PostInfo> = HashMap::new();
let mut prepub_map: HashMap<u8, PostInfo> = HashMap::new();
for volume in volume_brief_data.volumes.iter() {
let publishing_date = DateTime::parse_from_rfc3339(&volume.publishing).unwrap();
if publishing_date < Utc::now().sub(Duration::days(PAST_DAYS_ELIGIBLE as i64)) {
match self.series_has_parts {
true => continue,
false => break,
}
}
let new_part_info: PartInfo;
if self.series_has_parts {
let mut part_number: Option<u8> = None;
let splits: Vec<&str> = volume.slug.split('-').collect();
for (index, split) in splits.clone().into_iter().enumerate() {
if split == "part" {
part_number = Some(
splits[index + 1]
.parse::<u8>()
.expect("Split Element after 'Part' should always be a number"),
);
break;
}
}
match part_number {
Some(number) => new_part_info = Part(number),
None => {
info!("No Part found, assuming 1");
new_part_info = Part(1);
}
}
} else {
new_part_info = NoParts;
}
let post_url = format!(
"{}/series/{}#volume-{}",
jnc_base_url!(),
self.series_slug.as_str(),
volume.number
);
let post_details = PostInfoInner {
title: volume.title.clone(),
url: post_url.clone(),
thumbnail: Some(volume.cover.thumbnail.clone())
};
let new_post_info = PostInfo {
post_type: Some(PostType::Volume),
part: Some(new_part_info),
description: Some(volume.short_description.clone()),
lemmy_info: post_details,
};
let part_id = new_part_info.as_u8();
if publishing_date <= Utc::now() {
volume_map
.entry(part_id)
.and_modify(|val| {
if *val < new_post_info {
*val = new_post_info.clone()
}
})
.or_insert(new_post_info);
}
if let Some(prepub_info) = get_latest_prepub(&volume.slug).await {
let prepub_post_info = PostInfo {
post_type: Some(PostType::Chapter),
part: Some(new_part_info),
lemmy_info: prepub_info,
description: None,
};
prepub_map
.entry(part_id)
.and_modify(|val| {
if *val < prepub_post_info {
*val = prepub_post_info.clone()
}
})
.or_insert(prepub_post_info);
}
}
let mut result_vec: Vec<PostInfo> = volume_map.values().cloned().collect();
let mut prepub_vec: Vec<PostInfo> = prepub_map.values().cloned().collect();
result_vec.append(&mut prepub_vec);
Ok(result_vec)
}
}
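The `entry().and_modify().or_insert()` calls in `check_feed` above keep only the greatest `PostInfo` per part key. The bookkeeping pattern in isolation (a plain `u32` stands in for `PostInfo`'s part-based ordering; this is a sketch, not the bot's API):

```rust
use std::collections::HashMap;

// Keep the maximum value seen so far for each key:
// update in place if the new value is larger, insert if the key is new.
fn keep_max(map: &mut HashMap<u8, u32>, part_id: u8, value: u32) {
    map.entry(part_id)
        .and_modify(|val| {
            if *val < value {
                *val = value
            }
        })
        .or_insert(value);
}

fn main() {
    let mut map = HashMap::new();
    keep_max(&mut map, 1, 10);
    keep_max(&mut map, 1, 7); // smaller, ignored
    keep_max(&mut map, 1, 12); // larger, replaces
    keep_max(&mut map, 2, 3); // new key, inserted
    assert_eq!(map[&1], 12);
    assert_eq!(map[&2], 3);
}
```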
async fn get_latest_prepub(volume_slug: &str) -> Option<PostInfoInner> {
let response = match HTTP_CLIENT
.get(api_url!() + "/volumes/" + volume_slug + "/parts?format=json")
.send()
.await
{
Ok(data) => match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("While getting latest PrePub: {e}");
error!(err_msg);
return None;
}
},
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
return None;
}
};
let mut volume_prepub_parts_data: ChapterWrapper = match serde_json::from_str(&response) {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
return None;
}
};
volume_prepub_parts_data.parts.reverse(); // Makes breaking out of the parts loop easier
let mut post_details: Option<PostInfoInner> = None;
for prepub_part in volume_prepub_parts_data.parts.iter() {
let publishing_date = DateTime::parse_from_rfc3339(&prepub_part.launch).unwrap();
if publishing_date > Utc::now() {
break;
} else if publishing_date < Utc::now().sub(Duration::days(PAST_DAYS_ELIGIBLE as i64)) {
continue;
}
let thumbnail = prepub_part.cover.as_ref().map(|cover| cover.thumbnail.clone());
let post_url = format!("{}/read/{}", jnc_base_url!(), prepub_part.slug);
post_details = Some(PostInfoInner {
title: prepub_part.title.clone(),
url: post_url.clone(),
thumbnail
});
}
post_details
}
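The part-number extraction in `check_feed` above splits the volume slug on `-`, looks for a `part` segment, and parses the following segment, falling back to part 1. A standalone sketch of that parsing (the slug shape is an assumption inferred from the `expect` message; the example slugs are hypothetical):

```rust
// Extract the part number from a slug such as
// "hypothetical-series-part-5-volume-8"; slugs without a
// "part" segment fall back to 1, as the fetcher assumes above.
fn part_number_from_slug(slug: &str) -> u8 {
    let splits: Vec<&str> = slug.split('-').collect();
    for (index, split) in splits.iter().enumerate() {
        if *split == "part" {
            return splits[index + 1]
                .parse::<u8>()
                .expect("Split Element after 'part' should always be a number");
        }
    }
    1 // no "part" segment found, assume part 1
}

fn main() {
    assert_eq!(part_number_from_slug("hypothetical-series-part-5-volume-8"), 5);
    assert_eq!(part_number_from_slug("hypothetical-series-volume-2"), 1);
}
```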

src/fetchers/mod.rs (new file, 33 lines)
@ -0,0 +1,33 @@
use async_trait::async_trait;
use serde_derive::{Deserialize, Serialize};
use strum_macros::Display;
use crate::fetchers::Fetcher::Jnc;
use crate::fetchers::jnovel::JNovelFetcher;
use crate::lemmy::{PostInfo};
pub mod jnovel;
#[async_trait]
pub(crate) trait FetcherTrait {
fn new() -> Self where Self: Sized;
async fn check_feed(&self) -> Result<Vec<PostInfo>, ()>;
}
impl Fetcher {
pub(crate) async fn check_feed(&self) -> Result<Vec<PostInfo>, ()> {
match self {
Jnc(fetcher) => fetcher.check_feed().await,
/*default => {
let err_msg = format!("Fetcher {default} is not implemented");
error!(err_msg);
Err(())
}*/
}
}
}
#[derive(Deserialize, Serialize, Debug, Clone, Display)]
pub(crate) enum Fetcher {
#[serde(rename = "jnc")]
Jnc(#[serde(skip)] JNovelFetcher)
}

@ -1,415 +0,0 @@
use crate::jnovel::PartInfo::{NoParts, Part};
use crate::jnovel::PostInfo::{Chapter, Volume};
use crate::{write_error, HTTP_CLIENT};
use chrono::{DateTime, Duration, Utc};
use serde_derive::{Deserialize, Serialize};
use std::cmp::Ordering;
use std::collections::HashMap;
use std::ops::Sub;
use url::Url;
static PAST_DAYS_ELIGIBLE: u8 = 4;
macro_rules! api_url {
() => {
"https://labs.j-novel.club/app/v1".to_owned()
};
}
macro_rules! jnc_base_url {
() => {
"https://j-novel.club".to_owned()
};
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
struct VolumesWrapper {
volumes: Vec<VolumeDetail>,
pagination: PaginationInfo,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
struct ChapterWrapper {
parts: Vec<ChapterDetail>,
pagination: PaginationInfo,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
struct PaginationInfo {
limit: usize,
skip: usize,
#[serde(alias = "lastPage")]
last_page: bool,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub(crate) struct Cover {
#[serde(alias = "coverUrl")]
pub(crate) cover: String,
#[serde(alias = "thumbnailUrl")]
pub(crate) thumbnail: String,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub(crate) struct VolumeDetail {
pub(crate) title: String,
pub(crate) slug: String,
pub(crate) number: u8,
pub(crate) publishing: String,
#[serde(alias = "shortDescription")]
pub(crate) short_description: String,
pub(crate) cover: Cover,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
pub(crate) struct ChapterDetail {
pub(crate) title: String,
pub(crate) slug: String,
pub(crate) launch: String,
pub(crate) cover: Option<Cover>,
}
#[derive(Debug, Clone)]
pub(crate) struct LemmyPostInfo {
pub(crate) title: String,
pub(crate) url: Url,
}
#[derive(Debug, Copy, Clone)]
pub(crate) enum PartInfo {
NoParts,
Part(u8),
}
impl PartInfo {
pub(crate) fn as_u8(&self) -> u8 {
match self {
Part(number) => *number,
NoParts => 0,
}
}
pub(crate) fn as_string(&self) -> String {
self.as_u8().to_string()
}
}
impl PartialEq for PartInfo {
fn eq(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric == other_numeric
}
}
impl PartialOrd for PartInfo {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
if self.gt(other) {
Some(Ordering::Greater)
} else if self.eq(other) {
Some(Ordering::Equal)
} else {
Some(Ordering::Less)
}
}
fn lt(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric < other_numeric
}
fn le(&self, other: &Self) -> bool {
!self.gt(other)
}
fn gt(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric > other_numeric
}
fn ge(&self, other: &Self) -> bool {
!self.lt(other)
}
}
#[derive(Debug, Clone)]
pub(crate) enum PostInfo {
Chapter {
part: PartInfo,
lemmy_info: LemmyPostInfo,
},
Volume {
part: PartInfo,
description: String,
lemmy_info: LemmyPostInfo,
},
}
impl PostInfo {
pub(crate) fn get_part_info(&self) -> PartInfo {
match self {
Chapter {
part: part_info, ..
} => *part_info,
Volume {
part: part_info, ..
} => *part_info,
}
}
pub(crate) fn get_lemmy_info(&self) -> LemmyPostInfo {
match self {
Chapter { lemmy_info, .. } => lemmy_info.clone(),
Volume { lemmy_info, .. } => lemmy_info.clone(),
}
}
pub(crate) fn get_description(&self) -> Option<String> {
match self {
Chapter { .. } => None,
Volume { description, .. } => Some(description.clone()),
}
}
}
impl PartialEq for PostInfo {
fn eq(&self, other: &Self) -> bool {
let self_part = match self {
Chapter { part, .. } => part,
Volume { part, .. } => part,
};
let other_part = match other {
Chapter { part, .. } => part,
Volume { part, .. } => part,
};
self_part.eq(other_part)
}
}
impl PartialOrd for PostInfo {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
if self.gt(other) {
Some(Ordering::Greater)
} else if self.eq(other) {
Some(Ordering::Equal)
} else {
Some(Ordering::Less)
}
}
fn lt(&self, other: &Self) -> bool {
let self_part = match self {
Chapter { part, .. } => part,
Volume { part, .. } => part,
};
let other_part = match other {
Chapter { part, .. } => part,
Volume { part, .. } => part,
};
self_part < other_part
}
fn le(&self, other: &Self) -> bool {
!self.gt(other)
}
fn gt(&self, other: &Self) -> bool {
let self_part = match self {
Chapter { part, .. } => part,
Volume { part, .. } => part,
};
let other_part = match other {
Chapter { part, .. } => part,
Volume { part, .. } => part,
};
self_part > other_part
}
fn ge(&self, other: &Self) -> bool {
!self.lt(other)
}
}
pub(crate) async fn check_feed(series_slug: &str, series_has_parts: bool) -> Result<Vec<PostInfo>, ()> {
let response = match HTTP_CLIENT
.get(api_url!() + "/series/" + series_slug + "/volumes?format=json")
.send()
.await
{
Ok(data) => match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(());
}
},
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(());
}
};
let mut volume_brief_data: VolumesWrapper = match serde_json::from_str(&response) {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(());
}
};
volume_brief_data.volumes.reverse(); // Makes breaking out of the volume loop easier
// If no parts just use 0 as Part indicator as no Series with Parts has a Part 0
let mut volume_map: HashMap<u8, PostInfo> = HashMap::new();
let mut prepub_map: HashMap<u8, PostInfo> = HashMap::new();
for volume in volume_brief_data.volumes.iter() {
let publishing_date = DateTime::parse_from_rfc3339(&volume.publishing).unwrap();
if publishing_date < Utc::now().sub(Duration::days(PAST_DAYS_ELIGIBLE as i64)) {
match series_has_parts {
true => continue,
false => break,
}
}
let new_part_info: PartInfo;
if series_has_parts {
let mut part_number: Option<u8> = None;
let splits: Vec<&str> = volume.slug.split('-').collect();
for (index, split) in splits.clone().into_iter().enumerate() {
if split == "part" {
part_number = Some(
splits[index + 1]
.parse::<u8>()
.expect("Split Element after 'Part' should always be a number"),
);
break;
}
}
match part_number {
Some(number) => new_part_info = Part(number),
None => {
println!("No Part found, assuming 1");
new_part_info = Part(1);
}
}
} else {
new_part_info = NoParts;
}
let post_url = format!(
"{}/series/{series_slug}#volume-{}",
jnc_base_url!(),
volume.number
);
let post_details = LemmyPostInfo {
title: volume.title.clone(),
url: Url::parse(&post_url).unwrap(),
};
let new_post_info = Volume {
part: new_part_info,
description: volume.short_description.clone(),
lemmy_info: post_details,
};
let part_id = new_part_info.as_u8();
if publishing_date <= Utc::now() {
volume_map
.entry(part_id)
.and_modify(|val| {
if *val < new_post_info {
*val = new_post_info.clone()
}
})
.or_insert(new_post_info);
}
if let Some(prepub_info) = get_latest_prepub(&volume.slug).await? {
let prepub_post_info = Chapter {
part: new_part_info,
lemmy_info: prepub_info,
};
prepub_map
.entry(part_id)
.and_modify(|val| {
if *val < prepub_post_info {
*val = prepub_post_info.clone()
}
})
.or_insert(prepub_post_info);
}
}
let mut result_vec: Vec<PostInfo> = volume_map.values().cloned().collect();
let mut prepub_vec: Vec<PostInfo> = prepub_map.values().cloned().collect();
result_vec.append(&mut prepub_vec);
Ok(result_vec)
}
async fn get_latest_prepub(volume_slug: &str) -> Result<Option<LemmyPostInfo>, ()> {
let response = match HTTP_CLIENT
.get(api_url!() + "/volumes/" + volume_slug + "/parts?format=json")
.send()
.await
{
Ok(data) => match data.text().await {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(());
}
},
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(());
}
};
let mut volume_prepub_parts_data: ChapterWrapper = match serde_json::from_str(&response) {
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
write_error(err_msg);
return Err(());
}
};
volume_prepub_parts_data.parts.reverse(); // Makes breaking out of the parts loop easier
let mut post_details: Option<LemmyPostInfo> = None;
for prepub_part in volume_prepub_parts_data.parts.iter() {
let publishing_date = DateTime::parse_from_rfc3339(&prepub_part.launch).unwrap();
if publishing_date > Utc::now() {
break;
} else if publishing_date < Utc::now().sub(Duration::days(PAST_DAYS_ELIGIBLE as i64)) {
continue;
}
let post_url = format!("{}/read/{}", jnc_base_url!(), prepub_part.slug);
post_details = Some(LemmyPostInfo {
title: prepub_part.title.clone(),
url: Url::parse(&post_url).unwrap(),
});
}
Ok(post_details)
}

@ -1,139 +1,295 @@
use std::cmp::Ordering;
use crate::config::{Config, PostBody, PostConfig, SeriesConfig};
use crate::{HTTP_CLIENT};
use lemmy_api_common::community::{ListCommunities, ListCommunitiesResponse};
use lemmy_api_common::lemmy_db_views::structs::PostView;
use lemmy_api_common::person::{Login, LoginResponse};
use lemmy_api_common::post::{CreatePost, FeaturePost, GetPosts, GetPostsResponse};
use lemmy_db_schema::newtypes::{CommunityId, LanguageId, PostId};
use lemmy_db_schema::{ListingType, PostFeatureType};
use reqwest::StatusCode;
use std::collections::HashMap;
use std::sync::{RwLock};
use lemmy_db_schema::sensitive::SensitiveString;
use serde::{Deserialize, Serialize};
use systemd_journal_logger::connected_to_journal;
macro_rules! debug {
($msg:tt) => {
match connected_to_journal() {
true => log::debug!("[DEBUG] {}", $msg),
false => println!("[DEBUG] {}", $msg),
}
};
}
macro_rules! error {
($msg:tt) => {
match connected_to_journal() {
true => log::error!("[ERROR] {}", $msg),
false => eprintln!("[ERROR] {}", $msg),
}
};
}
pub(crate) struct Lemmy {
jwt_token: SensitiveString,
instance: String,
communities: HashMap<String, CommunityId>,
}
#[derive(Debug, Clone)]
pub(crate) struct PostInfoInner {
pub(crate) title: String,
pub(crate) url: String,
pub(crate) thumbnail: Option<String>
}
#[derive(Debug, Copy, Clone)]
pub(crate) enum PartInfo {
NoParts,
Part(u8),
}
impl PartInfo {
pub(crate) fn as_u8(&self) -> u8 {
match self {
PartInfo::Part(number) => *number,
PartInfo::NoParts => 0,
}
}
pub(crate) fn as_string(&self) -> String {
self.as_u8().to_string()
}
}
impl PartialEq for PartInfo {
fn eq(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric == other_numeric
}
}
impl PartialOrd for PartInfo {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
if self.gt(other) {
Some(Ordering::Greater)
} else if self.eq(other) {
Some(Ordering::Equal)
} else {
Some(Ordering::Less)
}
}
fn lt(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric < other_numeric
}
fn le(&self, other: &Self) -> bool {
!self.gt(other)
}
fn gt(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric > other_numeric
}
fn ge(&self, other: &Self) -> bool {
!self.lt(other)
}
}
#[derive(Debug, Clone, Copy)]
pub(crate) enum PostType {
Chapter,
Volume
}
#[derive(Debug, Clone)]
pub(crate) struct PostInfo {
pub(crate) part: Option<PartInfo>,
pub(crate) lemmy_info: PostInfoInner,
pub(crate) description: Option<String>,
pub(crate) post_type: Option<PostType>
}
impl PostInfo {
pub(crate) fn get_info(&self) -> PostInfoInner {
self.lemmy_info.clone()
}
pub(crate) fn get_description(&self) -> Option<String> {
self.description.clone()
}
pub(crate) fn get_part_info(&self) -> Option<PartInfo> {
self.part
}
pub(crate) fn get_post_config(&self, series: &SeriesConfig) -> PostConfig {
match self.post_type {
Some(post_type) => {
match post_type {
PostType::Chapter => series.prepub_community.clone(),
PostType::Volume => series.volume_community.clone(),
}
}
None => series.prepub_community.clone(),
}
}
pub(crate) fn get_post_data(&self, series: &SeriesConfig, lemmy: &Lemmy) -> CreatePost {
let post_config = self.get_post_config(series);
let post_body = match &post_config.post_body {
PostBody::None => None,
PostBody::Description => self.get_description(),
PostBody::Custom(text) => Some(text.clone()),
};
let community_id: CommunityId = lemmy.get_community_id(&post_config.name);
CreatePost {
name: self.get_info().title.clone(),
community_id,
url: Some(self.get_info().url),
custom_thumbnail: self.get_info().thumbnail,
body: post_body,
alt_text: None,
honeypot: None,
nsfw: None,
language_id: Some(LanguageId(37)), // TODO get this id once every few hours per API request, the ordering of IDs suggests that the EN Id might change in the future
}
}
}
impl PartialEq for PostInfo {
fn eq(&self, other: &Self) -> bool {
self.part.eq(&other.part)
}
}
impl PartialOrd for PostInfo {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
if self.gt(other) {
Some(Ordering::Greater)
} else if self.eq(other) {
Some(Ordering::Equal)
} else {
Some(Ordering::Less)
}
}
fn lt(&self, other: &Self) -> bool {
self.part < other.part
}
fn le(&self, other: &Self) -> bool {
!self.gt(other)
}
fn gt(&self, other: &Self) -> bool {
self.part > other.part
}
fn ge(&self, other: &Self) -> bool {
!self.lt(other)
}
}
impl Lemmy {
pub(crate) fn get_community_id(&self, name: &str) -> CommunityId {
*self.communities.get(name).expect("Given community is invalid")
}
pub(crate) async fn new(config: &RwLock<Config>) -> Result<Self, ()> {
let read_config = config.read().expect("Read Lock Failed").clone();
let login_params = Login {
username_or_email: read_config.get_username(),
password: read_config.get_password(),
totp_2fa_token: None,
};
let response = match HTTP_CLIENT
.post(read_config.instance.to_owned() + "/api/v3/user/login")
.json(&login_params)
.send()
.await
{
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
return Err(());
}
};
match response.status() {
StatusCode::OK => {
let data: LoginResponse = response
.json()
.await
.expect("Successful Login Request should return JSON");
match data.jwt {
Some(token) => Ok(Lemmy {
jwt_token: token.clone(),
instance: read_config.instance.to_owned(),
communities: HashMap::new(),
}),
None => {
let err_msg = "Login did not return JWT token. Are the credentials valid?".to_owned();
error!(err_msg);
Err(())
}
}
},
status => {
let err_msg = format!("Unexpected HTTP Status '{}' during Login", status);
error!(err_msg);
Err(())
}
}
}
pub(crate) async fn logout(&self) {
let _ = self.post_data_json("/api/v3/user/logout", &"").await;
}
pub(crate) async fn post(&self, post: CreatePost) -> Option<PostId> {
let response: String = match self.post_data_json("/api/v3/post", &post).await {
Some(data) => data,
None => return None,
};
let json_data: PostView = match self.parse_json_map(&response).await {
Some(data) => data,
None => return None,
};
Some(json_data.post.id)
}
async fn feature(&self, params: FeaturePost) -> Option<PostView> {
let response: String = match self.post_data_json("/api/v3/post/feature", &params).await {
Some(data) => data,
None => return None,
};
let json_data: PostView = match self.parse_json_map(&response).await {
Some(data) => data,
None => return None,
};
Some(json_data)
}
pub(crate) async fn unpin(&self, post_id: PostId, location: PostFeatureType) -> Option<PostView> {
let pin_params = FeaturePost {
post_id,
featured: false,
@ -142,7 +298,7 @@ impl Lemmy {
self.feature(pin_params).await
}
pub(crate) async fn pin(&self, post_id: PostId, location: PostFeatureType) -> Option<PostView> {
let pin_params = FeaturePost {
post_id,
featured: true,
@ -151,45 +307,23 @@ impl Lemmy {
self.feature(pin_params).await
}
pub(crate) async fn get_community_pinned(&self, community: CommunityId) -> Option<Vec<PostView>> {
let list_params = GetPosts {
community_id: Some(community),
type_: Some(ListingType::Local),
..Default::default()
};
let response: String = match self.get_data_query("/api/v3/post/list", &list_params).await {
Some(data) => data,
None => return None,
};
let json_data: GetPostsResponse = match self.parse_json(&response).await {
Some(data) => data,
None => return None,
};
Some(json_data
.posts
.iter()
.filter(|post| post.post.featured_community)
@ -197,44 +331,22 @@ impl Lemmy {
.collect())
}
pub(crate) async fn get_local_pinned(&self) -> Option<Vec<PostView>> {
let list_params = GetPosts {
type_: Some(ListingType::Local),
..Default::default()
};
let response: String = match self.get_data_query("/api/v3/post/list", &list_params).await {
Some(data) => data,
None => return None,
};
let json_data: GetPostsResponse = match self.parse_json(&response).await {
Some(data) => data,
None => return None,
};
Some(json_data
.posts
.iter()
.filter(|post| post.post.featured_local)
@ -242,41 +354,19 @@ impl Lemmy {
.collect())
}
pub(crate) async fn get_communities(&mut self) {
let list_params = ListCommunities {
type_: Some(ListingType::Local),
..Default::default()
};
let response: String = match self.get_data_query("/api/v3/community/list", &list_params).await {
Some(data) => data,
None => return,
};
let json_data: ListCommunitiesResponse = match self.parse_json::<ListCommunitiesResponse>(&response).await {
Some(data) => data,
None => return,
};
let mut communities: HashMap<String, CommunityId> = HashMap::new();
@ -285,6 +375,76 @@ impl Lemmy {
communities.insert(community.name, community.id);
}
self.communities = communities;
}
async fn post_data_json<T: Serialize>(&self, route: &str, json: &T ) -> Option<String> {
let res = HTTP_CLIENT
.post(format!("{}{route}", &self.instance))
.bearer_auth(&self.jwt_token.to_string())
.json(&json)
.send()
.await;
self.extract_data(res).await
}
async fn get_data_query<T: Serialize>(&self, route: &str, param: &T ) -> Option<String> {
let res = HTTP_CLIENT
.get(format!("{}{route}", &self.instance))
.bearer_auth(&self.jwt_token.to_string())
.query(&param)
.send()
.await;
self.extract_data(res).await
}
async fn extract_data(&self, response: Result<reqwest::Response, reqwest::Error>) -> Option<String> {
match response {
Ok(data) => {
if data.status().is_success() {
match data.text().await {
Ok(data) => Some(data),
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
None
}
}
}
else {
let err_msg = format!("HTTP Request failed: {}", data.text().await.unwrap());
error!(err_msg);
None
}
},
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
None
}
}
}
async fn parse_json<'a, T: Deserialize<'a>>(&self, response: &'a str) -> Option<T> {
match serde_json::from_str::<T>(response) {
Ok(data) => Some(data),
Err(e) => {
let err_msg = format!("while parsing JSON: {e} ");
error!(err_msg);
None
}
}
}
async fn parse_json_map<'a, T: Deserialize<'a>>(&self, response: &'a str) -> Option<T> {
debug!(response);
match serde_json::from_str::<HashMap<&str, T>>(response) {
Ok(mut data) => Some(data.remove("post_view").expect("Element should be present")),
Err(e) => {
let err_msg = format!("while parsing JSON HashMap: {e}");
error!(err_msg);
None
}
}
}
}
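The `extract_data` helper above collapses three outcomes into an `Option`: a successful response yields its body, while a non-success status or a transport error is logged and becomes `None`. An offline sketch of that branching, with a hypothetical `FakeResponse` standing in for `reqwest::Response`:

```rust
// Hypothetical stand-in for reqwest::Response, so the branching
// can run without a network.
struct FakeResponse {
    status_ok: bool,
    body: String,
}

fn extract_data(response: Result<FakeResponse, String>) -> Option<String> {
    match response {
        // Success: hand the body through.
        Ok(data) if data.status_ok => Some(data.body),
        // HTTP-level failure: log the body, return None.
        Ok(data) => {
            eprintln!("[ERROR] HTTP Request failed: {}", data.body);
            None
        }
        // Transport-level failure: log the error, return None.
        Err(e) => {
            eprintln!("[ERROR] {e}");
            None
        }
    }
}

fn main() {
    assert_eq!(
        extract_data(Ok(FakeResponse { status_ok: true, body: "ok".into() })),
        Some("ok".into())
    );
    assert_eq!(
        extract_data(Ok(FakeResponse { status_ok: false, body: "500".into() })),
        None
    );
    assert_eq!(extract_data(Err("timeout".into())), None);
}
```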

@ -1,104 +1,41 @@
use chrono::{Duration};
use log::{LevelFilter};
use once_cell::sync::Lazy;
use reqwest::Client;
use systemd_journal_logger::{JournalLog};
use crate::bot::Bot;
mod bot;
mod config;
mod lemmy;
mod post_history;
mod fetchers;
pub static HTTP_CLIENT: Lazy<Client> = Lazy::new(|| {
Client::builder()
.timeout(Duration::seconds(10).to_std().unwrap())
.connect_timeout(Duration::seconds(10).to_std().unwrap())
.build()
.expect("build client")
});
#[tokio::main]
async fn main() {
JournalLog::new()
.expect("Systemd-Logger crate error")
.install()
.expect("Systemd-Logger crate error");
match std::env::var("LOG_LEVEL") {
Ok(level) => {
let mut data = SharedData::new(); match level.as_str() {
"debug" => log::set_max_level(LevelFilter::Debug),
loop { "info" => log::set_max_level(LevelFilter::Info),
let write_data = Arc::new(RwLock::new(data.clone())); _ => log::set_max_level(LevelFilter::Info),
//let read_data = write_data.clone();
let persistent_data = write_data.clone();
let bot_thread = tokio::spawn(async move { bot::run(write_data).await });
let _ = bot_thread.await;
data = persistent_data.read().await.clone();
{
let err_msg = "Bot crashed due to unknown Error, restarting thread after wait...";
match connected_to_journal() {
true => error!("[ERROR] {err_msg}"),
false => println!("[ERROR] {err_msg}"),
} }
} }
_ => log::set_max_level(LevelFilter::Info),
sleep(
Duration::seconds(5)
.to_std()
.expect("Conversion should always work since static"),
)
.await;
} }
let mut bot = Bot::new();
bot.run().await;
} }
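The rewritten `main` selects the log level from a `LOG_LEVEL` environment variable, falling back to `Info` for unknown values and when the variable is unset. The selection logic can be sketched as a small pure function (the `level_from_env` name is hypothetical, introduced here for illustration):

```rust
// Sketch of the LOG_LEVEL match from main(): "debug" is the only value that
// raises verbosity; anything else, including an unset variable, means "info".
fn level_from_env(value: Option<&str>) -> &'static str {
    match value {
        Some("debug") => "debug",
        Some("info") => "info",
        _ => "info", // unknown values and Err(VarError) both fall back to info
    }
}

fn main() {
    assert_eq!(level_from_env(Some("debug")), "debug");
    assert_eq!(level_from_env(Some("verbose")), "info");
    assert_eq!(level_from_env(None), "info");
}
```

In the bot itself the result is applied via `log::set_max_level(LevelFilter::…)`, so an invalid value degrades gracefully instead of failing startup.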


@@ -1,6 +1,24 @@
-use crate::write_error;
 use serde_derive::{Deserialize, Serialize};
 use std::collections::HashMap;
+use systemd_journal_logger::connected_to_journal;
+
+macro_rules! info {
+    ($msg:tt) => {
+        match connected_to_journal() {
+            true => log::info!("[INFO] {}", $msg),
+            false => println!("[INFO] {}", $msg),
+        }
+    };
+}
+
+macro_rules! error {
+    ($msg:tt) => {
+        match connected_to_journal() {
+            true => log::error!("[ERROR] {}", $msg),
+            false => eprintln!("[ERROR] {}", $msg),
+        }
+    };
+}
 
 #[derive(Serialize, Deserialize, Default, Clone, Debug)]
 pub(crate) struct SeriesHistory {
@@ -9,6 +27,8 @@ pub(crate) struct SeriesHistory {
 
 impl SeriesHistory {
     pub(crate) fn load_history() -> Self {
+        let info_msg = "Loading History";
+        info!(info_msg);
         match confy::load(env!("CARGO_PKG_NAME"), "history") {
             Ok(data) => data,
             Err(e) => panic!("history.toml not found: {e}"),
@@ -16,9 +36,11 @@ impl SeriesHistory {
     }
 
     pub(crate) fn save_history(&self) {
+        let info_msg = "Saving History";
+        info!(info_msg);
         if let Err(e) = confy::store(env!("CARGO_PKG_NAME"), "history", self) {
             let err_msg = format!("Unexpected error saving to history.toml: {e}");
-            write_error(err_msg);
+            error!(err_msg);
         }
     }
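The `info!`/`error!` macros introduced here replace the old `write_error`/`write_info` helpers: each call site checks `connected_to_journal()` and routes the message either to the `log` crate (picked up by the journal) or to plain stdout/stderr. A minimal, self-contained sketch of that sink selection (assumption: a stub stands in for `systemd_journal_logger::connected_to_journal`, and the hypothetical `route_error` returns what would be emitted instead of printing):

```rust
// Stub for systemd_journal_logger::connected_to_journal(): pretend we are in a
// plain terminal so the example runs without the crate.
fn connected_to_journal() -> bool {
    false
}

// Returns the sink that would be used and the line that would be emitted;
// in the bot the true arm calls log::error! and the false arm eprintln!.
fn route_error(msg: &str) -> (&'static str, String) {
    match connected_to_journal() {
        true => ("journal", format!("[ERROR] {msg}")),
        false => ("stderr", format!("[ERROR] {msg}")),
    }
}

fn main() {
    let (sink, line) = route_error("Unexpected error saving to history.toml");
    assert_eq!(sink, "stderr");
    assert_eq!(line, "[ERROR] Unexpected error saving to history.toml");
}
```

Doing this in declarative macros rather than functions keeps the `log::info!`/`log::error!` call sites inside the module, so the correct level metadata is attached without threading a logger handle through every method.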