
Piero Bosio Personal Social Web Site

A social forum federated with the rest of the world. Instances don't matter, people do.
  • 0 Votes
    1 Posts
    13 Views
This tutorial will guide you through building a simple ActivityPub bot using Python. The bot will listen for mentions and, when it receives a message in a specific format, it will schedule and send a reminder back to the user after a specified delay. For example, if a user mentions the bot with a message like "@reminder@your.host.com 10m check the oven", the bot will reply 10 minutes later with a message like "🔔 Reminder for @user: check the oven".

Prerequisites

To follow this tutorial, you will need Python 3.10+ and the following libraries:

- apkit[server]: A powerful toolkit for building ActivityPub applications in Python. We use the server extra, which includes FastAPI-based components.
- uvicorn: An ASGI server to run our FastAPI application.
- cryptography: Used for generating and managing the cryptographic keys required for ActivityPub.
- uv: An optional but recommended fast package manager.

You can install these dependencies using uv or pip.

```
# Initialize a new project with uv
uv init

# Install dependencies
uv add "apkit[server]" uvicorn cryptography
```

Project Structure

The project structure is minimal, consisting of a single Python file for our bot's logic.

```
.
├── main.py
└── private_key.pem
```

- main.py: Contains all the code for the bot.
- private_key.pem: The private key for the bot's Actor. This will be generated automatically on the first run.

Code Walkthrough

Our application logic can be broken down into the following steps:

1. Imports and Configuration: Set up necessary imports and basic configuration variables.
2. Key Generation: Prepare the cryptographic keys needed for signing activities.
3. Actor Definition: Define the bot's identity on the Fediverse.
4. Server Initialization: Set up the apkit ActivityPub server.
5. Data Storage: Implement a simple in-memory store for created activities.
6. Reminder Logic: Code the core logic for parsing reminders and sending notifications.
7. Endpoint Definitions: Create the necessary web endpoints (/actor, /inbox, etc.).
8. Activity Handlers: Process incoming activities from other servers.
9. Application Startup: Run the server.

Let's dive into each section of the main.py file.

1. Imports and Configuration

First, we import the necessary modules and define the basic configuration for our bot.

```python
# main.py
import asyncio
import logging
import re
import uuid
import os
from datetime import timedelta, datetime

# Imports from FastAPI, cryptography, and apkit
from fastapi import Request, Response
from fastapi.responses import JSONResponse
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives import serialization as crypto_serialization
from apkit.config import AppConfig
from apkit.server import ActivityPubServer
from apkit.server.types import Context, ActorKey
from apkit.server.responses import ActivityResponse
from apkit.models import (
    Actor,
    Application,
    CryptographicKey,
    Follow,
    Create,
    Note,
    Mention,
    Actor as APKitActor,
    OrderedCollection,
)
from apkit.client import WebfingerResource, WebfingerResult, WebfingerLink
from apkit.client.asyncio.client import ActivityPubClient

# --- Logging Setup ---
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# --- Basic Configuration ---
HOST = "your.host.com"  # Replace with your domain
USER_ID = "reminder"    # The bot's username
```

Make sure to replace your.host.com with the actual domain where your bot will be hosted. These values determine your bot's unique identifier (e.g., @reminder@your.host.com).
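The tiny snippet below is not part of the tutorial's main.py; it only illustrates how HOST and USER_ID compose into the identifiers the rest of the code relies on (the variable names here are made up for illustration).

```python
# Illustrative only: how HOST and USER_ID combine into the bot's identifiers.
HOST = "your.host.com"
USER_ID = "reminder"

actor_url = f"https://{HOST}/actor"      # the Actor object's id, served at /actor
handle = f"@{USER_ID}@{HOST}"            # the handle other users will mention
webfinger_query = (                      # the lookup remote servers perform first
    f"https://{HOST}/.well-known/webfinger?resource=acct:{USER_ID}@{HOST}"
)
print(actor_url, handle, webfinger_query, sep="\n")
```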
2. Key Generation and Persistence

ActivityPub uses HTTP Signatures to secure communication between servers. This requires each actor to have a public/private key pair. The following code generates a private key and saves it to a file if one doesn't already exist.

```python
# main.py (continued)

# --- Key Persistence ---
KEY_FILE = "private_key.pem"

# Load the private key if it exists, otherwise generate a new one
if os.path.exists(KEY_FILE):
    logger.info(f"Loading existing private key from {KEY_FILE}.")
    with open(KEY_FILE, "rb") as f:
        private_key = crypto_serialization.load_pem_private_key(f.read(), password=None)
else:
    logger.info(f"No key file found. Generating new private key and saving to {KEY_FILE}.")
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    with open(KEY_FILE, "wb") as f:
        f.write(private_key.private_bytes(
            encoding=crypto_serialization.Encoding.PEM,
            format=crypto_serialization.PrivateFormat.PKCS8,
            encryption_algorithm=crypto_serialization.NoEncryption()
        ))

# Generate the public key from the private key
public_key_pem = private_key.public_key().public_bytes(
    encoding=crypto_serialization.Encoding.PEM,
    format=crypto_serialization.PublicFormat.SubjectPublicKeyInfo
).decode('utf-8')
```

3. Actor Definition

Next, we define the bot's Actor. The Actor is the bot's identity in the ActivityPub network. We use the Application type, as this entity is automated.

```python
# main.py (continued)

# --- Actor Definition ---
actor = Application(
    id=f"https://{HOST}/actor",
    name="Reminder Bot",
    preferredUsername=USER_ID,
    summary="A bot that sends you reminders. Mention me like: @reminder 5m Check the oven",
    inbox=f"https://{HOST}/inbox",    # Endpoint for receiving activities
    outbox=f"https://{HOST}/outbox",  # Endpoint for sending activities
    publicKey=CryptographicKey(
        id=f"https://{HOST}/actor#main-key",
        owner=f"https://{HOST}/actor",
        publicKeyPem=public_key_pem
    )
)
```

4. Server Initialization

We initialize the ActivityPubServer from apkit, providing it with a function to retrieve our Actor's keys for signing outgoing activities.

```python
# main.py (continued)

# --- Key Retrieval Function ---
async def get_keys_for_actor(identifier: str) -> list[ActorKey]:
    """Returns the key for a given Actor ID."""
    if identifier == actor.id:
        return [ActorKey(key_id=actor.publicKey.id, private_key=private_key)]
    return []

# --- Server Initialization ---
app = ActivityPubServer(apkit_config=AppConfig(
    actor_keys=get_keys_for_actor  # Register the key retrieval function
))
```

5. In-Memory Storage and Cache

To serve created activities, we need to store them somewhere. For simplicity, this example uses a basic in-memory dictionary as a store and a cache. In a production application, you would replace this with a persistent database (like SQLite or PostgreSQL) and a proper cache (like Redis).

```python
# main.py (continued)

# --- In-memory Store and Cache ---
ACTIVITY_STORE = {}  # A simple dict to store created activities
CACHE = {}           # A cache for recently accessed activities
CACHE_TTL = timedelta(minutes=5)  # Cache expiration time (5 minutes)
```

6. Reminder Parsing and Sending Logic

This is the core logic of our bot. The parse_reminder function uses a regular expression to extract the delay and message from a mention, and send_reminder schedules the notification. A sketch of one possible parse_reminder implementation follows the snippet below.

```python
# main.py (continued)

# --- Reminder Parsing Logic ---
def parse_reminder(text: str) -> tuple[timedelta | None, str | None, str | None]:
    """Parses reminder text like '5m do something'."""
    # ... (implementation omitted for brevity)
```
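The tutorial omits the body of parse_reminder. Here is a minimal sketch of one possible implementation, assuming the same signature (delay, message, and a human-readable time string) and supporting only the s/m/h/d suffixes; it is illustrative, not the tutorial's actual code.

```python
import re
from datetime import timedelta

# Illustrative parse_reminder: accepts text like "10m check the oven"
# and returns (delay, message, time_str), or (None, None, None) on failure.
_UNITS = {"s": "seconds", "m": "minutes", "h": "hours", "d": "days"}

def parse_reminder(text: str) -> tuple[timedelta | None, str | None, str | None]:
    match = re.search(r"(\d+)\s*([smhd])\s+(.+)", text, re.IGNORECASE | re.DOTALL)
    if not match:
        return None, None, None
    amount = int(match.group(1))
    unit = match.group(2).lower()
    message = match.group(3).strip()
    delay = timedelta(**{_UNITS[unit]: amount})  # e.g. timedelta(minutes=10)
    return delay, message, f"{amount}{unit}"

# Example: parse_reminder("10m check the oven")
# -> (timedelta(minutes=10), "check the oven", "10m")
```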
```python
# main.py (continued)

# --- Reminder Sending Function ---
async def send_reminder(ctx: Context, delay: timedelta, message: str, target_actor: APKitActor, original_note: Note):
    """Waits for a specified delay and then sends a reminder."""
    logger.info(f"Scheduling reminder for {target_actor.id} in {delay}: '{message}'")
    await asyncio.sleep(delay.total_seconds())  # Asynchronously wait

    logger.info(f"Sending reminder to {target_actor.id}")

    # Create the reminder Note
    reminder_note = Note(...)
    # Wrap it in a Create activity
    reminder_create = Create(...)

    # Store the created activities
    ACTIVITY_STORE[reminder_note.id] = reminder_note
    ACTIVITY_STORE[reminder_create.id] = reminder_create

    # Send the activity to the target actor's inbox
    keys = await get_keys_for_actor(f"https://{HOST}/actor")
    await ctx.send(keys, target_actor, reminder_create)
    logger.info(f"Reminder sent to {target_actor.id}")
```

The Note(...) and Create(...) constructions are elided here; a hedged sketch of one way to build them appears after the endpoint list below.

7. Endpoint Definitions

We define the required ActivityPub endpoints. Since apkit is built on FastAPI, we can use standard FastAPI decorators. The main endpoints are:

- Webfinger: Allows users on other servers to discover the bot using an address like @user@host. This is a crucial first step for federation.
- /actor: Serves the bot's Actor object, which contains its profile information and public key.
- /inbox: The endpoint where the bot receives activities from other servers. apkit handles this route automatically, directing activities to the handlers we'll define in the next step.
- /outbox: A collection of the activities created by the bot (in this example, a minimal collection assembled from the in-memory store).
- /notes/{note_id} and /creates/{create_id}: Endpoints to serve specific objects created by the bot, allowing other servers to fetch them by their unique ID.
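As promised above, here is a hedged sketch of how the reminder Note and Create objects elided in send_reminder might be constructed. The helper name and the keyword arguments are assumptions based on standard ActivityStreams vocabulary and the attributes the tutorial reads elsewhere (id, tag, href, preferredUsername); check the apkit model definitions before relying on them.

```python
import uuid

def build_reminder_objects(target_actor: APKitActor, message: str, original_note: Note) -> tuple[Note, Create]:
    """Hypothetical helper: builds the reminder Note and the Create that wraps it."""
    note_id = uuid.uuid4()
    create_id = uuid.uuid4()
    note = Note(
        id=f"https://{HOST}/notes/{note_id}",          # matches the /notes/{note_id} endpoint
        attributedTo=actor.id,                         # the bot is the author
        to=[target_actor.id],
        inReplyTo=original_note.id,                    # reply to the original mention
        content=f"<p>🔔 Reminder for @{target_actor.preferredUsername}: {message}</p>",
        tag=[Mention(href=target_actor.id, name=f"@{target_actor.preferredUsername}")],
    )
    create = Create(
        id=f"https://{HOST}/creates/{create_id}",      # matches the /creates/{create_id} endpoint
        actor=actor.id,
        to=[target_actor.id],
        object=note,
    )
    return note, create
```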
app.inbox("/inbox") @app.webfinger() async def webfinger_endpoint(request: Request, acct: WebfingerResource) -> Response: """Handles Webfinger requests to make the bot discoverable.""" if not acct.url: # Handle resource queries like acct:user@host if acct.username == USER_ID and acct.host == HOST: link = WebfingerLink(rel="self", type="application/activity+json", href=actor.id) wf_result = WebfingerResult(subject=acct, links=[link]) return JSONResponse(wf_result.to_json(), media_type="application/jrd+json") else: # Handle resource queries using a URL if acct.url == f"https://{HOST}/actor": link = WebfingerLink(rel="self", type="application/activity+json", href=actor.id) wf_result = WebfingerResult(subject=acct, links=[link]) return JSONResponse(wf_result.to_json(), media_type="application/jrd+json") return JSONResponse({"message": "Not Found"}, status_code=404) @app.get("/actor") async def get_actor_endpoint(): """Serves the bot's Actor object.""" return ActivityResponse(actor) @app.get("/outbox") async def get_outbox_endpoint(): """Serves a collection of the bot's sent activities.""" items = sorted(ACTIVITY_STORE.values(), key=lambda x: x.id, reverse=True) outbox_collection = OrderedCollection( id=actor.outbox, totalItems=len(items), orderedItems=items ) return ActivityResponse(outbox_collection) @app.get("/notes/{note_id}") async def get_note_endpoint(note_id: uuid.UUID): """Serves a specific Note object, with caching.""" note_uri = f"https://{HOST}/notes/{note_id}" # Check cache first if note_uri in CACHE and (datetime.now() - CACHE[note_uri]["timestamp"]) < CACHE_TTL: return ActivityResponse(CACHE[note_uri]["activity"]) # If not in cache, get from store if note_uri in ACTIVITY_STORE: activity = ACTIVITY_STORE[note_uri] # Add to cache before returning CACHE[note_uri] = {"activity": activity, "timestamp": datetime.now()} return ActivityResponse(activity) return Response(status_code=404) # Not Found @app.get("/creates/{create_id}") async def get_create_endpoint(create_id: uuid.UUID): """Serves a specific Create activity, with caching.""" create_uri = f"https://{HOST}/creates/{create_id}" if create_uri in CACHE and (datetime.now() - CACHE[create_uri]["timestamp"]) < CACHE_TTL: return ActivityResponse(CACHE[create_uri]["activity"]) if create_uri in ACTIVITY_STORE: activity = ACTIVITY_STORE[create_uri] CACHE[create_uri] = {"activity": activity, "timestamp": datetime.now()} return ActivityResponse(activity) return Response(status_code=404) 8. Activity Handlers We use the @app.on() decorator to define handlers for specific activity types posted to our inbox. on_follow_activity: Automatically accepts Follow requests. on_create_activity: Parses incoming Create activities (specifically for Note objects) to schedule reminders. # main.py (continued) # Handler for Follow activities @app.on(Follow) async def on_follow_activity(ctx: Context): """Automatically accepts follow requests.""" # ... (implementation omitted for brevity) # Handler for Create activities @app.on(Create) async def on_create_activity(ctx: Context): """Parses mentions to schedule reminders.""" activity = ctx.activity # Ignore if it's not a Note if not (isinstance(activity, Create) and isinstance(activity.object, Note)): return Response(status_code=202) note = activity.object # Check if the bot was mentioned is_mentioned = any( isinstance(tag, Mention) and tag.href == actor.id for tag in (note.tag or []) ) if not is_mentioned: return Response(status_code=202) # ... 
```python
    # ... (Parse reminder text; continuing inside on_create_activity)
    delay, message, time_str = parse_reminder(command_text)

    # If parsing is successful, schedule the reminder as a background task
    if delay and message and sender_actor:
        asyncio.create_task(send_reminder(ctx, delay, message, sender_actor, note))
        reply_content = f"<p>✅ OK! I will remind you in {time_str}.</p>"
    else:
        # If parsing fails, send usage instructions
        reply_content = "<p>🤔 Sorry, I didn't understand. Please use the format: `@reminder [time] [message]`.</p><p>Example: `@reminder 10m Check the oven`</p>"

    # ... (Create and send the reply Note)
```

9. Running the Application

Finally, we run the application using uvicorn.

```python
# main.py (continued)

if __name__ == "__main__":
    import uvicorn
    logger.info("Starting uvicorn server...")
    uvicorn.run(app, host="0.0.0.0", port=8000)
```

How to Run the Bot

1. Set the HOST and USER_ID variables in main.py to match your environment.
2. Run the server from your terminal:

```
uvicorn main:app --host 0.0.0.0 --port 8000
```

Your bot will be running at http://0.0.0.0:8000. Now you can mention your bot from anywhere in the Fediverse (e.g., @reminder@your.host.com) to set a reminder.

Next Steps

This tutorial covers the basics of creating a simple ActivityPub bot. Since it only uses in-memory storage, all reminders will be lost on server restart. Here are some potential improvements:

- Persistent Storage: Replace the in-memory ACTIVITY_STORE with a database like SQLite or PostgreSQL.
- Robust Task Queuing: Use a dedicated task queue like Celery with a Redis or RabbitMQ broker to ensure reminders are not lost if the server restarts.
- Advanced Commands: Add support for more complex commands, such as recurring reminders.

We hope this guide serves as a good starting point for building your own ActivityPub applications!

https://fedi-libs.github.io/apkit/
https://github.com/fedi-libs/apkit
https://github.com/AmaseCocoa/activitypub-reminder-bot
  • 0 Votes
    3 Posts
    48 Views
    @julian@activitypub.space It seems that NodeBB doesn't support the Mastodon API for polls. Since ActivityPub doesn't seem to support Poll natively, there needs to be an ActivityPub standard for poll voting. Otherwise, I suggest adopting the Mastodon API to improve compatibility with posts containing polls.
  • 0 Votes
    2 Posts
    14 Views
    Super stoked that Mastodon is rolling this out after many months of testing. That even a modicum of effort was put in to address the social failings of quote posting (as implemented on X/Twitter) is already a huge win for online public discourse.
  • 0 Votes
    2 Posts
    21 Views
    @hongminhee It's a place where our loosey goosey style goes into nondeterminism. We should tighten it up in the next version. My main answer would be: publishers, don't do that.
  • 0 Votes
    2 Posts
    11 Views
    This is what solidarity looks like: https://thenexusofprivacy.net/what-solidarity-looks-like/ (Part 2 of “Decentralization” and erasure: Blacksky, Bluesky, and the ATmosphere) @general @fediverse @fediversenews #blacksky #bluesky #activitypub
  • 0 Votes
    9 Posts
    53 Views
    trwnh@mastodon.social Yes, you're right, some messiness is bound to happen. I'm not trying to force all implementations into a specific inheritance pattern; that's why it's a "should", not a "must". Even then, one of my concerns is that while in an ideal scenario everybody inheriting their parent context leads to an entire collection all referencing the same context... in reality a lot of messiness will occur, objects will reference other contexts all over the place, etc. At the end of the day it's best effort, and if we are able to handle all that and still get to a point where backfill is achievable, then that's a win in my books.

    > it depends on how much you embrace the idea of each publisher being allowed to make their own claims (and how much you allow "clean up" after the fact)

    Part of me would like this to not happen, but it is unavoidable.
  • 0 Votes
    3 Posts
    28 Views
    rimu@piefed.social that's surprising, isn't aguppe just a standard 1b12 community? What integration did you have to add?
  • On Discourse and Decentralisation

    Uncategorized activitypub
    0 Votes
    3 Posts
    9 Views
    @ikuturso @fediversereport thanks! That was my experience also.
  • 0 Votes
    4 Posts
    15 Views
    Ah. Yes, I'll add that. Thanks!
  • OK friends!

    Uncategorized activitypub wordpress
    0 Votes
    4 Posts
    25 Views
    @Edent @blog wooo

The latest eight messages received from the Federation

  • Bavaria wants to move to Microsoft cloud by year-end

    Bavaria wants to equip its authorities with Microsoft 365. Critics expect license costs in the billions and warn of the loss of digital sovereignty.

    https://www.heise.de/en/news/Bavaria-wants-to-move-to-Microsoft-cloud-by-year-end-11066929.html?wt_mc=sm.red.ho.mastodon.mastodon.md_beitraege.md_beitraege&utm_source=mastodon


  • This is an important thing to consider, and why NodeBB decided to even pursue federation at all.

    It's arguable that we've reached the point at which forums cannot organically grow due to the ubiquity of social media. Depending on who you ask, we passed that point 10+ years ago.

    It's becoming increasingly imperative that forums federate or risk dying due to attrition. Forums used to be the social network for niche topics. Facebook (with Groups) and Twitter (with hashtags) started competing, and Reddit (with subreddits) made another huge dent.

    There are some communities that fear integrating with AP will cause their local communities to become flooded with just anybody. Those fears are unjustified, but understandable.


  • Heckin' yeah it is. :sunglasses:


  • Hey! Thanks for the concise reply. There's a lot of technical stuff I can say about Discourse and such, but because I am the maintainer for NodeBB it is probably in my best interest to keep my mouth shut as we directly compete!

    Anyhow, the OrderedCollection stuff is actually all from me. I've been working as part of the Threadiverse working group to bring intercompatible formats to all threadiverse software, which besides Discourse and NodeBB, includes Lemmy, Piefed, and Mbin.

    The OrderedCollection enables software (like NodeBB) to quickly backfill entire topics. This is a huge problem on the microblog side of the fediverse, and is not really a problem on the threadiverse, since there is already strong support for synchronization. However, smaller instances often do run into issues where they can't ever "catch up" on old posts because there's no way to get those posts. (e.g. start following a new community, you can't read any of the old content)

    To that end, Lemmy and Piefed have shipped (or soon will ship) code that allows software to backfill using OrderedCollections. They don't use them yet, but they will provide them. It helps software like mine because I will then be able to see entire threads from communities I don't even know about or follow. It's a huge boost to discovery! :smile:

    > while Discourse decided to use an OrderedCollection, with the first item being the opening post.

    NodeBB also does this, but they're not incompatible per se. You'll see NodeBB topics showing up just fine on Lemmy and Piefed (see activitypub@community.nodebb.org or general), and that's because NodeBB does the extra step of announcing OP and replies, just like Lemmy/Piefed.

    Importantly, Discourse does this too, but because of the inability to find Discourse categories, I don't think it's easy to follow them. Chicken and egg, really. The way the AP integration in Discourse is built out, it is more insular by design: threads from Discourse only ever go out to the fediverse, and you can't post in from the fediverse. That makes those communities much more insular and severely limits discovery.


  • @otters_raft I suggested on a few forums that I belong to that using the plugin might be useful and that federating content would be a good idea. What I got back was fairly negative feedback: e.g. people thought it would let people who don't have accounts on the relevant forum spam it and get around reputation requirements (not true, I understand), but also a kind of query as to why federation would be useful, with an assumption that people would just come to the forums and monitor them directly.

  • Thank you for the detailed explanation, that makes sense :)


  • Do you see any content on that page? I'm looking at !events@forum.fedimins.net but I don't see anything except the title and icon

Suggested posts