
Preface

Serving an album through memos is convenient, but it has a few pain points I couldn't work around:

  1. After my CDN traffic was abused, I shut down almost all of my China-based CDN services, and my low-bandwidth server can't serve a large number of images at once; anyone who has endured that crawl knows how slow it gets.
  2. S3 storage is expensive. After my COS bucket also had its traffic abused (yes, I'm that unlucky), I decided to keep copies in several places: uploads go mainly to GitHub, served through Cloudflare + Vercel + GitHub Pages + other SaaS offerings. The great advantage of these services is that they are free.
  3. The templates come from the web.

DeepSeek

The main feature code was written mostly with AI; the goal is simply that it works. As for bugs: what outrageous bug could a simple page generator have anyway... right?
The main code is in Python.

Features

  • Features I wanted
  1. Upload images to a GitHub repository, triggering Actions to generate the HTML pages automatically
  2. Album thumbnails must be compressed, with a click showing the original, to ease the load on the low-bandwidth server (yes, I periodically git-pull the repo to a server in China, so the album is reachable from several locations); see the thumbnail sketch after this list
  3. Generate titles automatically from the image file names
  • Updates
    2024.12.29
    Found another template that I think looks decent, and added it.
    Set which script to run in the workflow:
    the two scripts, times.py and lens.py, correspond to the two templates
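
A minimal sketch of that thumbnail step (assuming Pillow; the paths, sizes, and file name are illustrative, not the actual times.py/lens.py code):

# make_thumbs.py - shrink every photo into a thumbs/ mirror of photos/.
from pathlib import Path
from PIL import Image

SRC = Path("photos")
DST = Path("thumbs")
MAX_SIZE = (800, 800)  # bounding box for thumbnails

for img_path in SRC.rglob("*.jpg"):
    out_path = DST / img_path.relative_to(SRC)
    out_path.parent.mkdir(parents=True, exist_ok=True)
    with Image.open(img_path) as im:
        im.thumbnail(MAX_SIZE)         # resize in place, keeping aspect ratio
        im.save(out_path, quality=80)  # recompress; the original is untouched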

Demo sites

lens template: https://photo.asbid.cn
times template: https://photo.sgcd.net

Deployed on GitHub Pages.

Usage

Project template

https://github.com/jkjoy/generate-albums

Settings

In your own repository's Settings, add the following:


TOKEN is your GitHub token.

REPO is the repository where the album will be generated, in the form username/repo.


Upload rules

Upload album content into the photos directory.

Photos in the root of photos get the default title 分享生活 ("Sharing Life").

Create a subfolder, and its name becomes the title of all images in that directory.

  • Text in a .txt file with the same name as a photo is that photo's caption; highest priority.
    For 1.jpg with 1.txt, the text in 1.txt is used as the caption.
  • 描述.txt in a directory is the caption for all images in that directory; second priority.
  • If neither exists, the photo's file name is used as the caption (see the sketch after this list).
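
The lookup order above, as a minimal Python sketch (the function name is illustrative, not from the real scripts):

# caption.py - resolve a photo's caption by the priority rules above.
from pathlib import Path

def caption_for(photo: Path) -> str:
    # 1. Highest priority: a .txt sidecar named like the photo (1.jpg -> 1.txt).
    sidecar = photo.with_suffix(".txt")
    if sidecar.exists():
        return sidecar.read_text(encoding="utf-8").strip()
    # 2. Second priority: a directory-wide 描述.txt.
    dir_desc = photo.parent / "描述.txt"
    if dir_desc.exists():
        return dir_desc.read_text(encoding="utf-8").strip()
    # 3. Fallback: the bare file name without its extension.
    return photo.stem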

Other parts

You can adjust the layout and content as needed by editing the index.html of the corresponding template under the template directory.

Every change to the repository automatically triggers the Action, which generates the HTML into the target repository. The target repository can be served with GitHub Pages or deployed on Vercel; I won't go into details here.

Demo

https://photos-jkjkjoy.vercel.app/

Summary

AI really is useful!!
https://movie.douban.com/subject/2136204/



Follow-up

So how do we fetch the list of instances subscribed to the relay?
Reference project: https://github.com/dragonfly-club/dragon-relay
That project renders the list into a custom HTML page, which isn't flexible enough, so I modified it
to generate JSON data instead.

My original idea was to rebuild the Docker image with a Dockerfile, but on reflection that seemed like too much hassle, so I settled for a workaround.

Usage

Python

Contents of gen-member-list.py:

#!/usr/bin/python3

import logging
import requests
import base64
import json
from collections import Counter
from subprocess import Popen, PIPE
import shutil

outfile = 'output.json'
stats_file = 'stats.json'
USER_AGENT = 'Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/109.0 (https://relay.jiong.us)'
TIMEOUT = 4

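# Fingerprints of instances already seen, used to skip duplicate domains.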
instance_ids = set()

def setup_logging():
    logger = logging.getLogger(__name__)
    logger.setLevel(logging.INFO)
    log_handler = logging.FileHandler('gen-member-list.log')
    log_handler.setLevel(logging.INFO)
    log_format = logging.Formatter('%(asctime)s [%(levelname)s] %(message)s')
    log_handler.setFormatter(log_format)
    logger.addHandler(log_handler)
    return logger

logger = setup_logging()

def get_redis_cli_path():
    redis_cli = shutil.which('redis-cli')
    if redis_cli:
        return redis_cli
    else:
        raise FileNotFoundError("redis-cli not found in PATH")

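# Ask Redis (via redis-cli) for all relay subscription keys; each key ends in a subscribed domain.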
def read_redis_keys():
    redis_cli = get_redis_cli_path()
    cmd = [redis_cli]
    cmdin = 'KEYS relay:subscription:*'.encode('utf-8')
    p = Popen(cmd, stdin=PIPE, stdout=PIPE, stderr=PIPE)
    return p.communicate(input=cmdin)[0].decode('utf-8')

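# Build a fingerprint from identifying fields so one instance reachable via several domains is listed only once.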
def generate_instance_id(page):
    uid = []
    fields = ['uri', 'email', 'name', 'hcaptchaSiteKey']
    for field in fields:
        try:
            uid.append(str(page.get(field, '')))
        except AttributeError:
            pass

    try:
        if page.get('contact_account'):
            uid.append(str(page['contact_account'].get('id', '')))
            uid.append(str(page['contact_account'].get('username', '')))
    except AttributeError:
        pass

    return '_'.join(filter(None, uid))

def fetch_favicon(domain):
    try:
        favicon_url = f"https://{domain}/favicon.ico"
        response = requests.get(favicon_url, timeout=TIMEOUT)
        if response.status_code == 200:
            return base64.b64encode(response.content).decode('utf-8')
    except Exception as e:
        logger.warning(f"Failed to fetch favicon for {domain}: {str(e)}")
    return ""

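# Generic fallback: the /.well-known/nodeinfo document is served by most Fediverse software.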
def try_nodeinfo(headers, domain, timeout):
    nodeinfo_url = f"https://{domain}/.well-known/nodeinfo"
    response = requests.get(nodeinfo_url, headers=headers, timeout=timeout)
    nodeinfo_link = response.json()['links'][0]['href']

    response = requests.get(nodeinfo_link, headers=headers, timeout=timeout)
    nodeinfo = response.json()

    software = nodeinfo['software']['name']
    version = nodeinfo['software']['version']
    stats = nodeinfo['usage']

    fav_md = fetch_favicon(domain)
    title = nodeinfo.get("metadata", {}).get("nodeName", domain.split('.')[0].capitalize())

    uid = generate_instance_id(nodeinfo)

    json_line = {
        'favicon': fav_md,
        'title': title,
        'domain': domain,
        'users': stats['users']['total'],
        'posts': stats.get('localPosts', 0),
        'software': software,
        'version': version,
        'instances': nodeinfo.get('metadata', {}).get('federation', {}).get('domainCount', 0)
    }

    return json_line, uid

def try_mastodon(headers, domain, timeout):
    instance_url = f"https://{domain}/api/v1/instance"
    response = requests.get(instance_url, headers=headers, timeout=timeout)
    instance_info = response.json()

    stats_url = f"https://{domain}/api/v1/instance/peers"
    response = requests.get(stats_url, headers=headers, timeout=timeout)
    peers = response.json()

    fav_md = fetch_favicon(domain)
    title = instance_info['title']
    version = instance_info['version']

    uid = generate_instance_id(instance_info)

    json_line = {
        'favicon': fav_md,
        'title': title,
        'domain': domain,
        'users': instance_info['stats']['user_count'],
        'statuses': instance_info['stats']['status_count'],
        'instances': len(peers),
        'version': version,
        'software': 'mastodon'
    }

    return json_line, uid

def try_misskey(headers, domain, timeout):
    meta_url = f"https://{domain}/api/meta"
    response = requests.post(meta_url, headers=headers, timeout=timeout)
    meta_info = response.json()

    stats_url = f"https://{domain}/api/stats"
    response = requests.post(stats_url, headers=headers, timeout=timeout)
    stats = response.json()

    fav_md = fetch_favicon(domain)
    title = meta_info['name']
    version = meta_info['version']

    uid = generate_instance_id(meta_info)

    json_line = {
        'favicon': fav_md,
        'title': title,
        'domain': domain,
        'users': stats['originalUsersCount'],
        'notes': stats['originalNotesCount'],
        'instances': stats['instances'],
        'version': version,
        'software': 'misskey'
    }

    return json_line, uid

def generate_list():
    json_list = []
    all_domains = [line.split('subscription:')[-1] for line in read_redis_keys().split('\n') if line and 'subscription' in line]
    logger.info(f"Total domains from Redis: {len(all_domains)}")

    success_count = 0
    failure_count = 0
    software_counter = Counter()
    interaction_stats = {}

    for domain in all_domains:
        logger.info(f"Processing domain: {domain}")

        headers = {
            'User-Agent': USER_AGENT
        }

        json_line = {'domain': domain, 'status': 'Stats Unavailable'}
        uid = None
        success = False

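        # Try each API flavour in turn: Mastodon first, then Misskey, then generic nodeinfo.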
        for try_function in [try_mastodon, try_misskey, try_nodeinfo]:
            try:
                json_line, uid = try_function(headers, domain, TIMEOUT)
                logger.info(f"Successfully fetched stats for {domain} using {try_function.__name__}")
                success = True
                break
            except Exception as e:
                logger.warning(f"Failed to fetch stats for {domain} using {try_function.__name__}: {str(e)}")

        if success:
            success_count += 1
            software_counter[json_line.get('software', 'Unknown')] += 1

            interaction_count = json_line.get('statuses', 0) or json_line.get('notes', 0) or json_line.get('posts', 0)
            interaction_stats[domain] = interaction_count

            logger.info(f"Instances count for {domain}: {json_line.get('instances', 0)}")
        else:
            failure_count += 1

        if uid and uid in instance_ids:
            logger.info(f"Skipped duplicate domain {domain} with uid {uid}")
            continue

        if uid:
            instance_ids.add(uid)
        json_list.append(json_line)
        logger.info(f"Added {domain} to the list")

    logger.info(f"Total instances processed: {len(json_list)}")
    logger.info(f"Successful instances: {success_count}")
    logger.info(f"Failed instances: {failure_count}")
    logger.info(f"Software distribution: {dict(software_counter)}")

    json_list.sort(key=lambda x: x.get('users', 0), reverse=True)

    stats = {
        "total_instances": len(json_list),
        "successful_instances": success_count,
        "failed_instances": failure_count,
        "software_distribution": dict(software_counter),
        "interaction_stats": interaction_stats
    }

    with open(stats_file, 'w') as f:
        json.dump(stats, f, indent=2)

    return json_list

if __name__ == "__main__":
    logger.info('Started generating member list.')
    sub_list = generate_list()
    with open(outfile, 'w') as f:
        json.dump(sub_list, f, indent=2)
    logger.info('Write new page template done.')

Bash script

Contents of update-list.sh:

#!/bin/sh

if [ ! -f /tmp/setup_done ]; then # skip setup if the marker exists; recreating the container reinstalls dependencies
    apk add python3 py3-pip py3-requests # install Python and the requests library
    apk add tzdata # fix the timezone
    cp /usr/share/zoneinfo/Asia/Shanghai /etc/localtime
    echo "Asia/Shanghai" > /etc/timezone
    touch /tmp/setup_done
fi

cd /relay/

./gen-member-list.py || exit 1;

exit 0;

Modify docker-compose.yaml

This step mounts the local script directory into the Docker container:

services:
  redis:
    restart: always
    image: redis:alpine
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
    volumes:
      - "./redisdata:/data"
      - "./relay:/relay"

  worker:
    container_name: worker
    build: .
    image: yukimochi/activity-relay
    working_dir: /var/lib/relay
    restart: always
    init: true
    command: relay worker
    volumes:
      - "./actor.pem:/var/lib/relay/actor.pem"
      - "./config.yml:/var/lib/relay/config.yml"
    depends_on:
      - redis

  server:
    container_name: relay
    build: .
    image: yukimochi/activity-relay
    working_dir: /var/lib/relay
    restart: always
    init: true
    ports:
      - "8080:8080"
    command: relay server
    volumes:
      - "./actor.pem:/var/lib/relay/actor.pem"
      - "./config.yml:/var/lib/relay/config.yml"
    depends_on:
      - redis

Then drop both scripts into the relay folder.
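
Since update-list.sh invokes the Python script directly as ./gen-member-list.py, give it the execute bit:

chmod +x relay/gen-member-list.py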

Scheduled task

Use aaPanel (宝塔) or the system's cron scheduler.
activity-relay-redis-1 is the name of the Redis container; run the shell script with:

docker exec  activity-relay-redis-1  /bin/sh /relay/update-list.sh
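
For example, as a system crontab entry (the nightly 04:00 schedule is just an illustration):

0 4 * * * docker exec activity-relay-redis-1 /bin/sh /relay/update-list.sh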

Demo

https://relay.jiong.us


Preface

My original Activity-Relay deployment ran directly on the host, and a while ago, when the datacenter rebooted the machine, the relay inexplicably refused to start.

So this time I'm deploying the service with portable, easy-to-use Docker, and recording the steps for future reference.

Steps

Clone the repository

git clone https://github.com/yukimochi/Activity-Relay.git -b v2.0.0

Edit the configuration

Enter the Activity-Relay directory:

cd Activity-Relay
cp config.yml.example config.yml

Edit config.yml and adjust the settings; at a minimum, set the relay's domain and the Redis address (each field is documented in the example file).

Generate the actor RSA key

On Ubuntu (OpenSSL 3.x), use -traditional so the key is written in the traditional PKCS#1 format:

openssl genrsa -traditional | tee actor.pem

On CentOS (OpenSSL 1.x), the default output is already PKCS#1:

openssl genrsa -out actor.pem 1024

Set its permissions to 600:

chmod 600 actor.pem

docker-compose configuration

Port 8080 is exposed here so the reverse proxy can reach the service:

services:
  redis:
    restart: always
    image: redis:alpine
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
    volumes:
      - "./redisdata:/data"

  worker:
    container_name: worker
    build: .
    image: yukimochi/activity-relay
    working_dir: /var/lib/relay
    restart: always
    init: true
    command: relay worker
    volumes:
      - "./actor.pem:/var/lib/relay/actor.pem"
      - "./config.yml:/var/lib/relay/config.yml"
    depends_on:
      - redis

  server:
    container_name: relay
    build: .
    image: yukimochi/activity-relay
    working_dir: /var/lib/relay
    restart: always
    init: true
    ports:
      - "8080:8080"
    command: relay server
    volumes:
      - "./actor.pem:/var/lib/relay/actor.pem"
      - "./config.yml:/var/lib/relay/config.yml"
    depends_on:
      - redis

Build the image and start the services

docker-compose build
docker-compose up -d

Check the container status

docker-compose ps

Stop the services

docker-compose down

Reverse proxy

    location = /inbox {
        proxy_pass http://127.0.0.1:8080; 
        proxy_pass_request_headers on; 
        proxy_set_header Host $http_host; 
    }
    location = /actor {
        proxy_pass http://127.0.0.1:8080; 
        proxy_pass_request_headers on; 
        proxy_set_header Host $http_host; 
    }

Done

https://relay.jiong.us/inbox
Add this as a relay in Mastodon's admin settings and you're done.