Trigger Drafts backup to file on disk from url

I have become very fond of the .draftsExport backup files, which I import into a database, embed using a large language model and can semantically search and pass to other large language models as a way of aggregating and discovering things I’ve previously written in my journal.

Within Drafts, I can open Preferences → Backups → Backup now and save this file to a chosen location. Is there a way to trigger this backup from e.g. a URL? Right now this is a manual step for me, but if I could trigger it remotely, I could have the entire export → add-to-database flow as a single integrated workflow. I didn’t find an action that does this; if one exists, I could trigger it via URL and the problem would be solved. (An alternative I have found is to use a daily backup schedule and then have my script find the most recently created .draftsExport. That means relying on the schedule, though, and it could take up to 24 hours after my last changes before the backup reflects them.)

Please forgive me if the answer is obvious or if this has been asked before. I haven’t been able to find it on the forum, although I found many threads about exporting drafts to separate files (which I don’t want - I want all the metadata contained in a nice structure in the .draftsExport file).


I don’t recall seeing any URL option, Shortcuts or scripting option to trigger a backup. But there are a few questions it leads me to.

  1. How often do you need to import data? More than daily for a journal?
  2. Why import a full backup when all you would really need is the delta?

The backup file is a JSON array containing a set of data about each draft. You could create an action that checks a “last run” persistent variable, builds the JSON array for each draft modified since then, updates the variable, and puts the resulting delta backup file wherever you want it.
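The delta filter at the heart of that idea could be sketched in plain JavaScript with mock objects, so it runs outside Drafts. In a real action the drafts would come from `Draft.query` and the last-run timestamp would be persisted with the Drafts scripting API; the data below is invented, though `modifiedAt` does match the property name on real Drafts draft objects.

```javascript
// Keep only drafts modified strictly after the last run.
function deltaSince(drafts, lastRun) {
  return drafts.filter(d => d.modifiedAt > lastRun);
}

// Mock objects standing in for Draft.query() results (hypothetical data).
const drafts = [
  { uuid: "a", modifiedAt: new Date("2024-01-15") },
  { uuid: "b", modifiedAt: new Date("2024-03-01") },
];

// In Drafts this would be read from a persisted "last run" variable.
const lastRun = new Date("2024-02-01");

const delta = deltaSince(drafts, lastRun);
console.log(delta.map(d => d.uuid)); // [ 'b' ]
```

The action would then serialize only `delta` to JSON and bump the stored timestamp.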

Actions can be triggered via URL, or via Shortcuts - which has built in scheduling on iOS, and there are many scheduling options on the Mac where you could call via the command line.
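As a sketch of the command-line scheduling route on the Mac, a crontab entry could invoke an action via its URL. The action name `export_all` here is a placeholder, and the hourly schedule is just an example.

```shell
# Hypothetical crontab entry (added via `crontab -e` on macOS):
# run the "export_all" Drafts action at the top of every hour.
0 * * * * open "drafts://x-callback-url/runAction?text=nada&action=export_all"
```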

I’m not sure of the details you are working with, but I imagine you could be more efficient ingesting data incrementally. Maybe make a workspace that contains only drafts modified in the last n days and have an action that exports the content of that workspace as a JSON file – such an action could easily be triggered by a URL or Shortcut periodically.

That would mean you wouldn’t have to re-process your entire corpus of data every time. Again, it depends on what you are doing on the back end.

But, yes, there is not currently a way to trigger a full backup via automation – but an action could be written that writes out a similar JSON file of all drafts.

Have you written anything about your set-up? I’d be interested in experimenting with something like this…

Thanks for all of your replies! Your ideas put me on track to achieve what I wanted. I created my own action: it finds all drafts with Draft.query, exports them to JSON, and saves the result with the local FileManager. In my outside script, I can now call a URL to perform that action and copy the newly saved file. This gives me a sorted JSON file I can put under version control if I want to, and I can add any updated drafts to the database for searching and embedding.

For those interested, this is the action I have for now

// Function to format date as YYYY-MM-DD
function formatDate(date) {
    return date.getFullYear() + '-' +
           ('0' + (date.getMonth() + 1)).slice(-2) + '-' +
           ('0' + date.getDate()).slice(-2);
}

// Get current date for filename
let currentDate = formatDate(new Date());

// Use Draft.query to get drafts
let queryString = "";  // Empty query string to get all drafts
let filter = "all";    // Get drafts from all locations
let tags = [];         // No specific tags required
let omitTags = [];     // No tags to omit
let sort = "created"; // Sort by creation date
let sortDescending = true;  // Most recent first
let sortFlaggedToTop = false;  // Don't sort flagged to top

let drafts = Draft.query(queryString, filter, tags, omitTags, sort, sortDescending, sortFlaggedToTop);

// Limit to the 50 most recent drafts
// drafts = drafts.slice(0, 50);

// Use toJSON method to get the draft data
let exportArray = drafts.map(draft => draft.toJSON());


// Convert to JSON string
let jsonString = JSON.stringify(exportArray, null, 2);

// Create file name
let fileName = "/drafts_latest.draftsExport";

let fmLocal = FileManager.createLocal(); // Local file in app container
let success = fmLocal.writeString(fileName, jsonString);

if (!success) {
    console.log(`Failed to save the export file. Error: ${fmLocal.lastError}`);
    alert("Failed to save the export file. Check the console for details.");
}

And this shell script triggers the action and copies the saved JSON

# Call this url to export all drafts

URL="drafts://x-callback-url/runAction?text=nada&action=export_all"

open "$URL"

# Define the username variable

USERNAME=$(whoami)

# Construct the backup file path using the username variable

BACKUP_FILE="/Users/$USERNAME/Library/Containers/com.agiletortoise.Drafts-OSX/Data/Documents/drafts_latest.draftsExport"

# Copy backup file to current directory

cp "$BACKUP_FILE" .
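One caveat with the script above: `open` returns before Drafts has finished writing the export, so the `cp` can pick up a stale file. A hypothetical polling helper could wait for the file to actually update; this is a sketch only (the 30-second timeout is arbitrary, and it tries GNU `stat -c %Y` first with BSD/macOS `stat -f %m` as the fallback).

```shell
# Wait until $file exists and its mtime is at or after $since (epoch seconds),
# retrying once per second for up to 30 seconds. Returns 0 on success.
wait_for_update() {
  file="$1"
  since="$2"
  tries=0
  while [ "$tries" -lt 30 ]; do
    if [ -f "$file" ]; then
      # GNU stat first, then BSD/macOS stat.
      mtime=$(stat -c %Y "$file" 2>/dev/null || stat -f %m "$file" 2>/dev/null)
      if [ -n "$mtime" ] && [ "$mtime" -ge "$since" ]; then
        return 0
      fi
    fi
    sleep 1
    tries=$((tries + 1))
  done
  return 1
}
```

It could be wired in by recording `BEFORE=$(date +%s)` just before the `open` call, then running `wait_for_update "$BACKUP_FILE" "$BEFORE"` before the `cp`.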

No, I haven’t. I’m using sqlite-utils to put the JSON content into a database, and the llm CLI tool to embed it with a locally stored model.
