I recently developed the initial version of the Obsidian DEV Publish Plugin, a plugin that enables publishing Obsidian notes as articles on DEV. The first prototype was developed during a ~4-hour live stream.
When I was ready to try the plugin for real and create the first article, it worked on the first attempt. That was also the very first time I had ever loaded the plugin in Obsidian.
This was possible because almost every line of code had been developed using TDD. In fact, the only thing that failed was some code that couldn't be tested automatically: publishing a second time, which should update the previously posted article.
This is an effect I often experience when practicing TDD: the code works the first time. Not always, but often! And when it doesn't work the first time, the issues are usually relatively trivial, at least compared to non-TDD code.
Unfortunately, the Obsidian API does not make it easy to actually write meaningful tests of the part of the code that communicates with Obsidian, as the TypeScript types quickly bring in a web of dependencies our code really doesn't care about.
In this article, I will describe some of those problems, and the TypeScript tricks I applied to solve them.
The Closed-Source Problem
Obsidian itself is closed source, which means that we cannot write tests that call into Obsidian code, unless we actually run the tests from within Obsidian. The only things we have access to as plugin developers are the TypeScript type definitions describing the classes and functions available at run-time.
Running the test suite inside Obsidian does defy one of the primary principles of TDD: that we should get the fastest possible feedback. If we discard that option, we are left with no choice but to mock out everything handled by Obsidian, even helper functions that we would have liked to actually call from tests.
To clarify, I am not arguing against running tests inside Obsidian. Some complex plugins do just that. But the essence of TDD is about setting up a fast feedback cycle to achieve an efficient development process. So for a TDD process, it's not the right choice.
You may want to write tests for reasons other than the feedback loop. E.g., for a UI-heavy plugin, TDD may not provide the right feedback cycle for a large part of the code base. In that case it would make perfect sense to add tests after a feature is implemented, to prevent regressions.
Reading/Updating Frontmatter
A feature of the plugin is that the first time you run it for a given note, a new article should be created on DEV. If you run it again on the same note, the article should be updated to reflect the latest note contents. To handle this, the plugin stores the DEV article id in the frontmatter.
Reading and updating frontmatter is performed by the FileManager class, which has the following declaration:
export class FileManager {
  getNewFileParent(sourcePath: string, newFilePath?: string): TFolder;
  renameFile(file: TAbstractFile, newPath: string): Promise<void>;
  generateMarkdownLink(file: TFile, sourcePath: string, subpath?: string, alias?: string): string;
  processFrontMatter(file: TFile, fn: (frontmatter: any) => void, options?: DataWriteOptions): Promise<void>;
  getAvailablePathForAttachment(filename: string, sourcePath?: string): Promise<string>;
}
As Obsidian code is not available, we must provide some alternate implementation. If you're familiar with sinon, you might think we can create a stubbed instance like this:
const fileManagerStub = sinon.createStubInstance(FileManager);
fileManagerStub.processFrontMatter.callsFake(/* ... */);
But that is not possible when we don't have access to the class! We can, however, create a new class that conforms to the same interface, because a class in TypeScript creates both a value at runtime (the actual class) and an interface at design time. The interface represents the functions and properties available on instances of the class. It does not imply any inheritance relationship.
When a piece of code depends on a "class", the TypeScript compiler verifies that the argument is compatible with the interface, but not that it inherits from the class.
// FileManager in this scope refers to the *interface*, meaning that we can
// provide any argument that conforms to that interface.
class Publisher {
  constructor(fileManager: FileManager) { /* ... */ }
}
We can simply write a FakeFileManager class that implements the FileManager interface, and use that in test code, allowing us to express the desired behaviour:
let fileManager: FakeFileManager;
let publisher: Publisher;
beforeEach(() => {
  fileManager = new FakeFileManager();
  publisher = new Publisher(fileManager);
});
it("Should create an new article if the frontmatter has no article id", () => {
const file = fileManager.createFakeFile({
frontMatter: {
"dev-article-id": undefined
}
}); // Returns a TFile
await publisher.publish(file);
devGateway.create.should.have.been.calledOnce;
devGateway.update.should.not.have.been.called;
});
it("Should update an existing article if the frontmatter has an article id", () => {
const file = fileManager.createFakeFile({
frontMatter: {
"dev-article-id": 42
}
}); // Returns a TFile
await publisher.publish(file);
devGateway.update.should.have.been.calledOnceWith(match({ articleId: 42 }));
devGateway.create.should.not.have.been.called;
})
One slightly annoying issue with this approach is that the plugin only depends on one function in the interface, processFrontMatter, but the parameter type expresses a dependency on the entire FileManager interface. To be compatible, the fake implementation must provide dummy implementations of all the remaining functions, if the goal is to have strong types in test code too (it is!). It's not a big issue (the biggest is still to come), but it is noise in the test code.
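To illustrate that noise, a fully conforming fake based on the declaration above would look roughly like this (a sketch; only processFrontMatter carries any real fake behaviour, and the createFakeFile test helper is omitted):

import { DataWriteOptions, FileManager, TAbstractFile, TFile, TFolder } from "obsidian";

class FakeFileManager implements FileManager {
  // The only method the plugin actually cares about
  processFrontMatter(file: TFile, fn: (frontmatter: any) => void, options?: DataWriteOptions): Promise<void> {
    /* ... fake behaviour ... */
    return Promise.resolve();
  }

  // Dummy implementations, present only to satisfy the compiler
  getNewFileParent(sourcePath: string, newFilePath?: string): TFolder {
    throw new Error("Not implemented");
  }
  renameFile(file: TAbstractFile, newPath: string): Promise<void> {
    throw new Error("Not implemented");
  }
  generateMarkdownLink(file: TFile, sourcePath: string, subpath?: string, alias?: string): string {
    throw new Error("Not implemented");
  }
  getAvailablePathForAttachment(filename: string, sourcePath?: string): Promise<string> {
    throw new Error("Not implemented");
  }
}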
Fortunately, there is a solution to that. TypeScript supports duck typing, i.e., if a class has the methods and properties defined in an interface, then the class is compatible with that interface. The implements keyword can indicate that a class should implement an interface, but it is not required. It merely helps when the intent is that a class should conform to a specific interface, as you get more helpful compiler errors when it doesn't.
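As a tiny, self-contained illustration of that structural compatibility (the names here are purely illustrative and unrelated to the plugin):

interface Greeter {
  greet(name: string): string;
}

// No "implements Greeter" here, but the shape matches...
class EnglishGreeter {
  greet(name: string) {
    return `Hello, ${name}`;
  }
}

// ...so an instance is assignable wherever a Greeter is expected.
const greeter: Greeter = new EnglishGreeter();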
I use this to my advantage and turn the problem upside down. Rather than writing a class that conforms to Obsidian's interface, I can make Obsidian's FileManager class conform to my interface!
interface GenericFileManager {
  processFrontMatter(file: TFile, fn: (frontmatter: any) => void, options?: DataWriteOptions): Promise<void>;
}

export class Publisher {
  fileManager: GenericFileManager;

  constructor(fileManager: GenericFileManager) {
    this.fileManager = fileManager;
  }

  async publish(file: TFile) {
    /* ... */
  }
}

export class MainPlugin /* ... */ {
  async publish(file: TFile) {
    await new Publisher(this.app.fileManager).publish(file);
  }
}
I have now organised the code such that the closed-source FileManager class is compatible with my own GenericFileManager interface. I can now write tests that express the desired behaviour, and the compiler does not force me to write dummy method implementations I never call from the plugin.
It is actually more profound than that. By turning the problem upside down, I have made the code more conformant with the interface segregation principle (ISP), which states that "no code should be forced to depend on methods it does not use". When the plugin depended on the FileManager interface, it was forced to depend on 5 functions. Now it only depends on the one being used. The fact that the real implementation provides more capabilities than needed is of no concern; nor does it affect the maintainability of the code.
But there is a worse offender against ISP: TFile, the class representing a file in Obsidian.
Generalising on TFile
Despite being able to remove the unneeded methods from our direct dependencies, we cannot ignore that processFrontMatter accepts an input of type TFile, which is now an indirect dependency of the plugin. This is unfortunate, as it brings a cascade of indirect dependencies.
export class TFile extends TAbstractFile {
  stat: FileStats;
  basename: string;
  extension: string;
}

export abstract class TAbstractFile {
  vault: Vault;
  path: string;
  name: string;
  parent: TFolder | null;
}
Arg! 😱 TFile has a dependency on Vault, which is everything in Obsidian. The plugin code does not depend on any of the properties of TFile; the plugin just passes the value around to other Obsidian functions, such as FileManager.processFrontMatter.
How can we get rid of the dependency on the Vault? By making the GenericFileManager ... generic.
interface GenericFileManager<TFile> {
  processFrontMatter(file: TFile, fn: (frontmatter: any) => void): Promise<void>;
}
With this declaration, implementing the FakeFileManager is extremely simple. (I also need a FakeVault, as the Vault provides the functionality to read the contents of the file, but I will ignore that in the rest of this article to keep things simple; the problem and solution are identical to the FileManager.)
type FakeFile = {
  frontMatter: any;
  contents: string;
};
class FakeFileManager implements GenericFileManager<FakeFile> {
  processFrontMatter(file: FakeFile, fn: (frontmatter: any) => void) {
    fn(file.frontMatter);
    return Promise.resolve();
  }
}
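As a side note, the createFakeFile helper used in the earlier tests is not part of the GenericFileManager interface; it only exists on the fake as a convenience for setting up test data. Fully written out, my fake could look roughly like this (the default values are an illustrative choice):

class FakeFileManager implements GenericFileManager<FakeFile> {
  processFrontMatter(file: FakeFile, fn: (frontmatter: any) => void) {
    fn(file.frontMatter);
    return Promise.resolve();
  }

  // Convenience factory for tests; returns a plain FakeFile object
  createFakeFile(options: { frontMatter?: any; contents?: string } = {}): FakeFile {
    return {
      frontMatter: options.frontMatter ?? {},
      contents: options.contents ?? "",
    };
  }
}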
The Publisher class, which depended on this interface, is now also forced to be generic, but that's simple enough:
class Publisher<TFile> {
  fileManager: GenericFileManager<TFile>;

  constructor(fileManager: GenericFileManager<TFile>) {
    this.fileManager = fileManager;
  }

  async getFrontMatter(file: TFile) {
    return new Promise<any>((resolve, reject) => {
      this.fileManager.processFrontMatter(file, resolve).catch(reject);
    });
  }

  async publish(file: TFile) {
    const frontMatter = await this.getFrontMatter(file);
    const articleId = frontMatter['dev-article-id'];
    if (typeof articleId === 'number') {
      await update(/* ... */);
    } else {
      await create(/* ... */);
    }
  }
}
The code above basically says: the Publisher needs to be constructed with some fileManager. It also has a function, publish, that must receive a file as input. The Publisher itself neither knows nor cares what a file is; it just cares that whatever it receives is something the fileManager knows how to deal with.
In test code the plugin is constructed with fake implementations:
const fakeFileManager = new FakeFileManager()
const publisher = new Publisher(fakeFileManager)
In the main plugin file, the plugin is constructed with the real implementation:
const publisher = new Publisher(this.app.fileManager)
I didn't even need to specify the generic type argument for the Publisher constructor; it was inferred from the passed argument.
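If I had wanted to spell it out, the following would be equivalent, with the type parameter resolving to Obsidian's TFile:

const publisher = new Publisher<TFile>(this.app.fileManager)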
Now the ISP is followed; no plugin code has any dependency on any function or property it doesn't need. I can replace closed-source classes in tests with minimal fakes that just simulate the behaviour I care about for the feature I am testing.
That is how I was able to write a plugin where 50% of the functionality worked the first time it was loaded in Obsidian, and the other 50% required changing two or three easily identified lines of code before it worked as well.
Optionally Including Some Properties of TFile
I will add a hypothetical case which is not relevant for this plugin, but could be for readers wanting to adapt this approach.
If your plugin depends on some properties of TFile, for example basename and extension, you can add a type constraint to the generic type parameter.
/**
 * Represents properties of a TFile that _our_ plugin depends on
 */
type GenericFile = {
  basename: string;
  extension: string;
};

/**
 * A valid implementation of GenericFile that is used just for tests
 */
type FakeFile = GenericFile & {
  // Add whatever properties are relevant for the test doubles
  frontMatter: any;
};

class MyPluginLogic<TFile extends GenericFile> {
  processFile(file: TFile) {
    // In this scope, we can rely on the file having a basename and extension property.
  }
}
This tells the compiler that only types that have a basename: string and an extension: string property can be used as generic type arguments, so the code can now safely use these two properties.
As the real TFile has these properties, that is not a problem. The compiler will force us to add them to the FakeFile implementation, but they would already exist, because the entire point was to enable the practice of TDD, which means you would add them to the fake implementation before actually writing the code that depends on them ;)
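For example, a FakeFile literal created in test code now has to include them (the values here are purely illustrative):

const file: FakeFile = {
  basename: "my-note",
  extension: "md",
  frontMatter: { "dev-article-id": 42 },
};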
Coming Up: Dealing with Inheritance
To create an Obsidian plugin, you cannot avoid creating classes that inherit from classes in the closed-source part of Obsidian; at a minimum, the main plugin class must inherit from Obsidian's Plugin class. In my case, I didn't test the main plugin class, as there is virtually no complexity in it. But there are other cases where it may be necessary.
This makes it seemingly impossible to create such an instance in test code; how can you create an instance of a class if the base class is not accessible? But it is actually possible, a topic I will be covering in an upcoming article.
But the gist of it is: a class in JavaScript is just an identifier in the current scope. It is a reference to a function (the constructor). Functions can receive functions as arguments, and they can return functions, which implies they can also receive and return classes. This makes it possible to write a function that creates a class (not an instance, a class), and this function could potentially use a parameter as the base class for the constructed class. The JavaScript extends keyword simply operates on an identifier in the current scope.
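A rough sketch of the idea (the names are purely illustrative, and the typing is deliberately simplified):

// A function that receives a class (a constructor) and returns a new class
// extending it. Base is just a parameter; at runtime it could be the real
// Obsidian class, while tests can pass a minimal stand-in.
function createMyPluginClass(Base: new (...args: any[]) => object) {
  return class MyPlugin extends Base {
    async publishCurrentFile() {
      /* ... */
    }
  };
}

// In the real plugin:  const MyPlugin = createMyPluginClass(Plugin);
// In test code:        const TestablePlugin = createMyPluginClass(class {});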
I will write a proper article with code examples, including the TypeScript types necessary for this to compile correctly.
p.s.
This article was of course written in Obsidian, and published using the Obsidian DEV publish plugin (and then fixing some things that the plugin doesn't yet handle).
The plugin is not yet available in the official community plugin list, but it can be installed using BRAT. Be aware of how your DEV API keys are stored before using it.
The live stream is currently available on twitch.tv/stroiman. I will eventually publish it to YouTube/@stroiman.development, with pauses removed.
Have mercy on me, I am still a total noob in regards to streaming and video content.