In my plugin I have to write downloaded files into the vault. Those files can be small or really big.
In the beginning I worked with:

```typescript
writeBinary(normalizedPath: string, data: ArrayBuffer, options?: DataWriteOptions): Promise<void>;
```

This works well for small files, but as the `data` parameter suggests, it is not going to work well with large files: the complete file is loaded into memory before it is written.
The good thing is that files written this way are immediately available for use afterwards, and I don’t have to worry about what happens if the write process is interrupted.
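To illustrate the memory issue, here is a rough sketch of the buffer-based path. The `saveSmallObject` helper and the narrowed `VaultLike` type are my own illustrations, not part of the Obsidian API; only the `writeBinary` signature comes from above:

```typescript
// VaultLike narrows Obsidian's Vault to the single adapter method used here.
interface VaultLike {
  adapter: {
    writeBinary(normalizedPath: string, data: ArrayBuffer): Promise<void>;
  };
}

// Illustrative sketch: with writeBinary, the whole object must be buffered
// in memory as an ArrayBuffer before a single byte reaches disk.
async function saveSmallObject(
  vault: VaultLike,
  path: string,
  url: string
): Promise<void> {
  const response = await fetch(url);
  const data = await response.arrayBuffer(); // entire file held in RAM
  await vault.adapter.writeBinary(path, data);
}
```

For a 10 KB image this is fine; for a multi-gigabyte video the whole payload has to fit in memory at once.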
Use case or problem
Downloading files such as images, audio, and videos from S3 buckets and using them in markdown files.
Proposed solution
- I suggest introducing a function that writes binary files as streams, closely mirroring the function already in place.
- If this approach isn’t viable, a deeper dive into Obsidian’s reload process might be required. For some unknown reason, reloading Obsidian doesn’t interrupt active streams.
- It would be beneficial if there was an event to monitor Obsidian reloads. This would enable plugin developers to appropriately manage and close any open streams. Thankfully, stream closure already works when the `unload` event gets triggered.
Current workaround
Currently, I’ve crafted my own function to write files using streams (shown here with an illustrative method signature; `stream` is the readable stream of the downloaded object):

```typescript
import { createWriteStream } from "fs";
import { Readable } from "stream";

writeStreamToVault(stream: Readable, objectPath: string): Promise<void> {
    return new Promise<void>((resolve, reject) => {
        const writeStream = createWriteStream(objectPath);
        this.addOpenStream(writeStream);
        writeStream.on("finish", () => {
            this.removeOpenStream(writeStream);
            resolve();
        });
        writeStream.on("error", (error) => {
            this.removeOpenStream(writeStream);
            reject(error);
        });
        stream.on("error", (error) => {
            this.removeOpenStream(writeStream);
            writeStream.destroy();
            reject(error);
        });
        stream.pipe(writeStream);
    });
}
```
This effectively writes a readable stream to the specified path. But it has its limitations:
- Immediate Availability: Unlike the earlier method, the file isn’t instantly accessible. To mitigate this, I’ve implemented a retry mechanism:

```typescript
import { TFile } from "obsidian";

async function getAbstractFileWithRetry(
    path: string,
    retries = 10,
    interval = 100
): Promise<TFile | null> {
    for (let i = 0; i < retries; i++) {
        const file = app.vault.getAbstractFileByPath(path);
        if (file) {
            return file as TFile;
        }
        await new Promise((res) => setTimeout(res, interval));
    }
    return null;
}
```
- Stream Management: Presently, there’s no clear way to manage open streams when Obsidian reloads. Stream cleanup works when the plugin is deactivated and the `onunload` event fires, and when Obsidian is fully shut down, but there’s a loophole: if users reload Obsidian while a stream is still active, the affected files end up locked in a peculiar state and become inaccessible to Obsidian. These files seem to get protected, requiring administrator permissions to delete. However, fully closing Obsidian restores the files’ accessibility.
By addressing these issues, we can achieve a more seamless and reliable process for writing and accessing files in the plugin.
Link to my plugin in its current state - https://github.com/RagedUnicorn/obsidian-plugin-s3-link/blob/master/src/cache.ts#L61