Neumes were among the first forms of music notation, the predecessor of modern musical notes. They were an early technology that empowered those who could decipher them with the knowledge of how to perform any music written in them.
HIFI Labs launched the musicOS project last month, with a focus on providing a homepage for Web3 Music by building an application on top of open data.
Through our discovery phase we’ve realised that it will be beneficial to separate the data layer out under its own name and organisation, giving it full freedom to operate in a decentralised manner and be entirely open source. We call this neume and are supporting it as a standalone project.
There’s a careful balance to strike when straddling product discovery and open source: managing perceived conflicts and promoting genuine positive engagement, while ensuring that momentum and funding are in place to build what’s needed. By supporting open source projects alongside musicOS, HIFI Labs is committing to the future of public goods while ensuring speedy and iterative application development.
We believe that open source is fundamental to the future of Web3, and we want to lead by example. To this end, Dan has joined HIFI Labs to drive forward open source projects: to focus on initiating, supporting, and backing open source development, and to work with the HIFI team on how we can collaborate more across all our activities. Tim is leading the technical development of neume.
neume’s goal is to build a socially scalable and open-source metadata retrieval, indexing, and management infrastructure for Web3 Music.
neume is tasked with indexing all activity within the emerging Web3 Music industry. It will provide the infrastructure needed to easily spin up platforms that experiment with showcasing Web3 Music and the endless potential of deeper connections between Artists and Fans.
Activity within neume, at least within the first three months, can be categorised across three distinct areas:
Our documents within the neume GitHub provide a deeper look at the current and planned technical and social architecture.
The focus within neume is to build infrastructure that will underpin future platforms in a public and open way, so that we don’t repeat the mistakes of the current music industry where data is siloed and innovation is stymied.
The defining quality of the neume project is its drive toward increasing social scalability: facilitating an implementation process that seeds and builds momentum for decentralised contribution through network effects.
Social scalability is the idea that modularisation, trust minimisation, standardisation, and an integrative approach to dealing with conflicts can vastly accelerate and, most importantly, scale an open source project’s number of participants.
We’re conscious of our approach’s apparent fragility at the moment; for example, “Why aren’t you using IPFS? What about The Graph?”. Through implementing prototypes that interface with both of these technologies and by talking to our peers, it’s clear that significant foundational effort is required to bring the music industry’s infrastructure into the 21st century, and therefore a novel approach is required.
In the sense of Glen Weyl’s “Radical Markets”, we at neume are “music radicals”. To bring about productive change and accommodation for the changing media landscape, we’re committing ourselves to helping to the best of our abilities.
Regarding the data types that neume will seek to provide access to, the first step is activity generated through the sale of music NFTs. As artists continue to experiment with social tokens, these too represent an area of significant value for aggregation and indexing.
Thinking longer-term, identity and its resolution across multiple accounts, social interactions (follows, playlists), next-generation NFTs (such as “account-bound” tokens), all present interesting opportunities to further the contextual picture that neume can provide to developers.
Furthermore, once a foundation has been established, we can look into areas like the right to state claims (for example, identity x a claim = relative authority around that claim) and licensing for Web3 music: e.g., can we build standard licenses that people can use, similar to the popular OSS license types?
As has been said many times, we are very early on this collective journey towards forging a new music industry. neume represents an opportunity to build in the true spirit of Web3: a credibly neutral public good.
Activity towards this goal is already underway and we’d love your help.
Success for neume comes through collaboration; to this end we are rolling out an Open Tasks Board that catalogues issues available for open source development. Equally, we’d love to chat if you have any ideas or think you can assist in its development in a more involved capacity.
Dan
In this post, I'm elaborating on the most pressing challenges we're currently facing in the neume network. Just for context, this isn't an exhaustive post; we expect more technical challenges as we continue building. But with this post, I want to make the broader neume network community aware of the problems we're currently hitting when developing neume.
But first of all: if you end up finding what's written here interesting and you're thinking of contributing, we have a contributors guide on GitHub, and you can get paid.
Neume's primary goal is to enable Hifilabs to ship the first version of music-os. For that, the neume crawl should produce new data roughly every five minutes from sound.xyz and catalog (v1 and v2). We've agreed that it's sufficient to dump this data into the neume-network/data repository. Additionally, we want to structure sound.xyz's data download so that it is "editions-aware," meaning, e.g., that sound.xyz track titles no longer include the "#3" edition number.
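As an illustration of the "editions-aware" idea above, here's a hypothetical helper that splits a trailing "#3"-style edition suffix off a track title. The `splitEdition` name and the exact `#N` suffix format are my assumptions for the sketch, not neume's actual schema.

```javascript
// Hypothetical sketch (not neume's actual code): separate the base track
// title from a trailing "#N" edition suffix, if one is present.
function splitEdition(title) {
  // Lazily capture the title, then require a "#<digits>" run at the end.
  const match = title.match(/^(.*?)\s*#(\d+)\s*$/);
  if (!match) return { title, edition: null };
  return { title: match[1], edition: Number(match[2]) };
}

console.log(splitEdition("Genesis Track #3")); // { title: "Genesis Track", edition: 3 }
console.log(splitEdition("No Edition Here")); // { title: "No Edition Here", edition: null }
```

A downstream consumer could then render the clean title while still keeping the edition number as structured metadata.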
Finally, we want each music NFT to have a universally unique identifier. In talking with the music-os team, we've agreed that the priority of these issues is as follows:
Now, from my side as the project's architect, I want to second and highlight the importance of these goals. Within the next few weeks, we want to have these issues addressed, as doing so enables a real customer (Hifilabs) to ship a real product (music-os). It is our chance to build something useful that people want, and we must take advantage of it.
Given that I've now established the project's context, here are technical challenges and generally components that need work in the neume ecosystem.
Currently, neume-network uses Spinamp's minimal web3 music NFT subgraph to download all of Ethereum mainnet's tracks from platforms like catalog, sound, and NOIZD. However, we ultimately don't want to rely on this level of intermediation for receiving our data. We want to minimize the amount of infrastructure necessary to run the neume network: ideally, an Ethereum full node suffices. Additionally, there are benefits to interacting with immutable contracts: they never change.
Spinamp's web3 music NFT subgraph currently crawls the three platforms mentioned above: But neume's mission is greater than that: It's a socially scalable extraction and transformation engine for web3, and so supporting as many platforms, and dapp's data as possible must be our target.
To that end, we're aiming to replace the subgraph strategy with directly extracting blocks from a given block height and then extracting their logs on a case-by-case basis, e.g., depending on the emitted topics.
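As a sketch of that direction, here's how the JSON-RPC request for such a log extraction could be assembled. The block numbers are arbitrary examples; a real crawler would POST this payload to a full node's RPC endpoint.

```javascript
// Sketch: build an eth_getLogs JSON-RPC request asking an Ethereum full node
// for all logs in a block range that match a given set of event topics.
function buildGetLogsRequest(fromBlock, toBlock, topics) {
  return {
    jsonrpc: "2.0",
    id: 1,
    method: "eth_getLogs",
    params: [
      {
        // Block heights must be hex-encoded quantities per the JSON-RPC spec.
        fromBlock: "0x" + fromBlock.toString(16),
        toBlock: "0x" + toBlock.toString(16),
        topics, // e.g. [transferTopic] to match only Transfer events
      },
    ],
  };
}

const req = buildGetLogsRequest(15000000, 15000100, []);
console.log(req.params[0].fromBlock); // "0xe4e1c0"
```

Filtering by topic at the node means neume only ever pulls the log entries it knows how to transform, instead of every event on mainnet.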
Then, if you're looking at our code and the technical overview of neume's architecture - you'll find that we're currently writing in flat files. So far, this had the benefit of not committing to, e.g., a database schema. But it's starting to bother me that we haven't figured out a good way of storing and ordering data in neume.
For now, each strategy can write directly to disk into a flat-file. So, essentially, we're appending to files, and the outcome is what I'm showing over at TimDaub/temp-neume-data-dir-results.
However, this way of handling new data has problems:
We can't generate an order for the data that we've successfully extracted and transformed (e.g., we can't sort by "newest songs")
We still do not know what type of database we want to use. However, we know what we want: it has to be embedded, meaning it cannot require additional installation or configuration from a neume network user. And it has to handle many writes, ideally concurrently and even from separate threads.
So to recap, another unsolved problem in the neume network ecosystem is integrating a fitting embedded database into the stack to enable Hifilabs's music-os use case of showing "the latest tracks."
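To make that gap concrete, here's a toy illustration (with invented track data) of the ordered read that an embedded database would make cheap via an index, and that append-only flat files make awkward:

```javascript
// Invented example data: in neume today, each of these would live appended
// to a flat file with no ordering guarantee.
const tracks = [
  { title: "Track A", mintedAt: 1660000000 },
  { title: "Track B", mintedAt: 1661000000 },
  { title: "Track C", mintedAt: 1659000000 },
];

// "Give me the n latest tracks." An embedded database would serve this from
// an index on mintedAt; with flat files, we must re-scan and re-sort everything.
function latestTracks(rows, n) {
  return [...rows].sort((a, b) => b.mintedAt - a.mintedAt).slice(0, n);
}

console.log(latestTracks(tracks, 2).map((t) => t.title)); // ["Track B", "Track A"]
```

The cost of the re-sort grows with the whole data set on every query, which is exactly why an ordered, embedded store matters for the "latest tracks" use case.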
Another problem is that we're currently doing a bad job of controlling the onboarding experience for people who find and want to use the neume network. We've already had three or four seriously interested users end up confused by, e.g., our way of working with git submodules and npm packages.
In my opinion, this is not so much about teaching users how to do things. Nor is it about artificially simplifying the stack so that everybody can use the tool. It may not even be about documentation. Rather, it is about us as the neume core contributors being able to control the user experience, meaning that, e.g., we're immediately aware of issues when we push new code.
So this problem is about building scalable mechanisms for working together as a team of async workers. E.g., can we run a nightly integration test for neume-network/core that exercises the entire user flow through to downloading all the data? What problems did the user experience? Did they end up reporting them to our GitHub repository? Did they find our Discord channel? How well received were our existing documents?
Controlling the user experience during onboarding means being vulnerable to embarrassment and creating issues that you don't want to see publicly. But, on the other hand, when we do this frequently and adamantly, we get great open source software that changes people's lives because it is useful.
A "Living System" is one that grows into its environment, by self-organizing around opportunities. Living systems can last for a long time, adapt well to change, and thus be highly successful. By contrast, "Planned Systems" tend to be fragile, poor at coping with change, and thus short-lived.
Neume must be built to become a living system, an organism. We can't just ship computer code; that'd be unsustainable, as it'd stop working the day we stopped maintaining it. Rather, to truly build something useful, we must strive to create the neume network as an organism or a living system: one that can adapt to the challenges ahead, that can grow and shrink according to its environmental factors, and that tends to have more upside than downside (anti-fragile). It should be default-alive.
Throughout countless meetings and months of working together, we've been trying to continuously ship not only the neume network software but also the neume network social architecture that details how we collaborate.
Specifically, for work like the universally unique identifiers, a track's normalized schema, and properly parsing and integrating sound.xyz's edition data, we've now created neuIP, a standards track allowing us to express and reflect on consensus decisions in the community.
With the neume network, we're on the brink of shipping useful software for Hifilabs and the broader Ethereum and music NFT ecosystem. However, we're facing several technical and social challenges: we don't yet download Ethereum blocks directly from their source, we lack a clear database direction, we don't control the user experience, and we still have to build a scalable social system.
At neume, we're capable of welcoming new contributors quickly through a staged process of engagement where, tomorrow, you could get paid for contributing. So we don't just want to advertise the neume network as socially scalable: we also want to practice it, and we think that in its current form, the existing software architecture can absorb more eager brains.
If you're a neume regular, we eagerly await your tickets to solve the above problems. If you're new to neume and this read was interesting: Head over to our Discord and say "hi!".
Tim
Currently, we have three preferred communication channels:
Any and all questions around the project can be asked across #neume-dev (technical) and #neume-chat (general). The neume core team often provide ad-hoc updates and requests to the community for discussion.
An opportunity for the community to ask specific questions and spark discussion around ideas in a voice-chat format. Previous open office hours are recorded and saved in the #neume-archives channel on Discord.
The neume core team are also relatively active on Twitter if that is your preferred format (Tim, Dan), and we plan to ramp up activity through the neume Twitter in good time.
We are constantly trying to improve the processes by which anyone can contribute to neume, detailed in a couple of key living documents:
Beyond task-based contribution, we have also established a process for "neuIPs" (neume improvement proposals), through which the community can suggest more fundamental changes to the protocol, as defined in neuIP-1.
neume is useful when it is being utilised in real world experimentation; core to our development mentality is letting use cases drive the prioritisation of building.
The world of blockchain/crypto/web3 is a constantly evolving environment, and while we feel we have a pretty good idea of what neume needs to become to support the next generation of music and the creative industries, there will no doubt be examples along the way that we can build towards to accelerate that journey.
If you feel like neume is something that could enable your idea, then please reach out across any of the channels detailed above and we can discuss further.
Dan
In the following post, I want to highlight a problem we've encountered in the music NFT industry. As a quick aside: these days, I work as a tech lead on neume.network, where we're essentially building a socially scalable data pipeline to extract and augment on-chain data for all of the music NFTs hosted on Ethereum. In contrast to the Graph Protocol, at neume we're not trying to make contract storage data more accessible. Instead, we want to build a list of music NFTs that users can listen to via an app like Spotify.
As it's a lot of work to extract music metadata from smart contracts, today we got some user feedback, and to my surprise, it sounded like we had unconsciously optimized for indexing the big curated music NFT platforms like sound.xyz, beta.catalog.works, zora.co, and mintsongs.com V2.
Truly, this was never our intention. I was caught off guard by the user feedback, so this post is somewhat of a reflection on what happened and on the problematic emergent effects it led to.
See, for musicos.xyz, as it's an audio player like Spotify, we naturally wanted to index the best and probably most popular platforms. So when starting the project, we intuitively went for the popular ones, those we knew provided nice-sounding songs.
And when musicos launched a few days ago, I suddenly found myself confronted with user feedback questioning the credible neutrality of the Neume Network indexer. Here's what users said:
Additionally, an artist I am a fan of had this to say on Twitter:
So let us unpack what is being discussed here, in my own words: Guspy, Supersigil, and thepark.eth are all asking for their tracks to be listed on the music-os app we had just released. And they did so by pointing out an interesting dynamic in the music NFT space. Guspy mentions that the music they minted through a custom contract on manifold.xyz isn't showing up on music-os. They're saying that getting their contract indexed is a challenge, and that had they instead minted their song on Catalog, Sound, or MintSongs, it probably would have made it into music-os.
Extrapolating their statements, we can reason that any track that gets lots of exposure through a music player may fare better in primary and secondary sales. So for guspy et al., having their songs exposed on music-os could potentially mean an improvement in income, or at the very least that their songs get listened to at all.
In the post by Supersigil below, they double down on the argument: artists need to be given the option to "be their own platforms," and hence the Neume Network's favoritism in implementing the big music NFT platforms first is a challenge to all artists hosting independent contracts.
There's also an interesting insight in their posts: the platforms are curated by genre or type of music. So, despite some artists' work having the potential for popularity, they may never end up being exposed to larger audiences, given the moderation policies of the respective platforms.
And there's more context to unpack here: historically, the big platforms like sound.xyz and catalog.works have been heavily curated, or at least that's how their music sounds to my ears. I can back this up with visitor data, too, as I had played around with a minimalist website, which I called tracks.lol, that allowed users to listen to sound.xyz's and catalog.works's songs.
For a brief moment after its release, it gathered a notable listener community and was widely shared on Twitter, despite the website not having any meaningful web design or other functionality. Below's a screenshot of the page, or you can see it yourself by visiting tracks.lol.
My intention had been to experiment in the true sense, so I carefully limited the website's utility to test the music's popularity on its own.
As you can see, there weren't any fancy animations and no marketing website to convert users. All there was, was a single page that played music from sound.xyz and catalog, with the hypothesis being that people would still like this arguably "shitty" website because it played nice music! And oh boy, did they! Here's a screenshot of my plausible.io tracker; you can access the numbers yourself at plausible.io/tracks.lol.
So clearly, it couldn't have been my sick web design skills or the amazing utility you got from the website's controls. The reason people briefly shared the page was that the music on it was nice.
And let me reassure you, this is also the qualitative feedback I gather from anyone I manage to expose to sound.xyz's and catalog's tracks! They're doing a great job of properly curating NFTs to seed the initial consumption network.
And I can make the inverse argument, too: if you built a website that focused on listing the highest-grossing NFT sales on OpenSea, you wouldn't end up with an aesthetically pleasing newsfeed of NFTs. You'd just get a seemingly random list of Bored Apes. In fact, I can prove this to you right away by asking you to visit context.app/trending, where, at the time of writing, the four latest updates in the feed are apes. Boring!
But despite curation platforms like sound.xyz and catalog.works having accelerated the music NFT industry in the first place, there is a tragic side to this utility provision too, which is the problem pointed out today by thepark.eth, Supersigil, and guspy: while curation can elevate an artist's work, it also gatekeeps other artists from putting their latest tracks in front of a larger audience. And it discriminates against genres.
In the end, this capability to curate which artists and songs become popular comes with a lot of influence. Curating, gatekeeping, and/or censoring: those actions create power structures. Structures that are actively being misused today for further self-serving profit.
This is actually well documented; a recent prominent example is CoryxKenshin's rebellion against YouTube's arbitrariness, favoritism, and racism in moderating potentially harmful content. It highlights a few faults.
Namely, that those who moderate do so opaquely, with inconsistent guideline interpretation, potential implicit bias, and little public accountability. Governance, as we say in the crypto-sphere, doesn't seem to be a meaningful keyword there.
But OK, what has any of this to do with EIP-4973 and Account-bound tokens? Yes, so far this post has focused on the problems and not a solution. That's a principle I work by called "problem-driven development." Now that we're sufficiently informed, let's discuss a potential solution.
A thesis I have been pursuing with rugpullindex.com, the Neume Network, and now also with Account-bound tokens is that providing infrastructure and making it broadly available to all players in a market can produce a wealth transfer, or generally broader equity. With rugpullindex.com, I saw this happen as others built mobile apps on top of my API. With the Neume Network, we'll see this effect emerge as developers are already capable of replicating the musicos.xyz experience by using our GPL-3-licensed Neume Network crawler or by simply using our data set at neume-network/data.
As pioneered by Sabre and Amadeus in the airline industry, by commoditizing and co-owning the infrastructure (namely, the database that holds all future flights), we're similarly able to create competition around the supply of good music NFT metadata offerings. And we postpone what Ben Thompson calls "Aggregation Theory": the effect of profit-seeking and monopolizing a market's supply side.
But enough with the theory. Simply put, EIP-4973 can make the data structure for consensual music curation available to anyone with a wallet, using account-bound tokens to express agreements on-chain. And it simultaneously removes relevancy from the big curation platforms like sound.xyz and catalog. So how would it work?
See, EIP-4973 is a truly peer-to-peer contract in the sense that no single individual or group has special privileges when interacting with the contract. This isn't true for many contracts, by the way! Most EIP-20 contracts implement permissioned minting, and so do EIP-721 tokens, to preserve artificial scarcity.
But EIP-4973 doesn't implement such hierarchical logic. It's flat: for two addresses (EOAs or contracts), if both addresses provide a valid signed message, then an agreement over a document hosted at `string tokenURI` may be etched to the chain.
That storing of a consensual agreement can be done in two ways, and surprisingly, it's NOT done through minting. Instead, we implement two bulky functions called `function give` and `function take`. Here's a drawing of how they work.
`from` can `give` an ABT to `to` and is the sender of the transaction. `to` can `take` an ABT from `from` and is the transaction's sender. I'm linking the reference implementation code snippets below in case you want to dive deeper, but for continuing to read this post, it's not necessary to understand them deeply.
To clarify: the result of both of these functions is that a new token is minted to address `to`, and their validation method is similar. Namely, both need two valid signatures over `string tokenURI`, one from address `from` and one from address `to`.
Their difference is who represents the active part on-chain and who just provides a signature. The figure below outlines the difference in both cases: for `function give(address to, ...)`, address `from` must wait for address `to`'s signature to arrive and can only then send the transaction on-chain to cement their agreement. Whereas in the case of `function take(address from, ...)`, address `to` takes the ABT from address `from` and hence has to include `from`'s signature.
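To make the two-sided consent visible, here's a toy model of the give/take flow in JavaScript. This is emphatically NOT the Solidity reference implementation: signature verification is mocked as a plain string comparison, the "contract" is an in-memory map, and the transaction sender's consent is modeled as simply being the caller. Only the consent logic carries over.

```javascript
// Conceptual model of EIP-4973's give/take flow. All names and the signature
// scheme are invented for illustration; no cryptography is performed.
function makeABTContract() {
  const tokens = new Map(); // tokenId -> owner address
  let nextId = 0;

  // Mocked check: a "signature" is valid if it names the signer and the
  // tokenURI being agreed to. A real contract would recover an ECDSA signer.
  function validSignature(signer, tokenURI, sig) {
    return sig === `${signer}:${tokenURI}`;
  }

  function mint(to) {
    const id = nextId++;
    tokens.set(id, to);
    return id;
  }

  return {
    // `from` is the transaction sender; it must present `to`'s signature.
    give(from, to, tokenURI, toSig) {
      if (!validSignature(to, tokenURI, toSig)) throw new Error("unauthorized");
      return mint(to);
    },
    // `to` is the transaction sender; it must present `from`'s signature.
    take(to, from, tokenURI, fromSig) {
      if (!validSignature(from, tokenURI, fromSig)) throw new Error("unauthorized");
      return mint(to);
    },
    ownerOf(id) {
      return tokens.get(id);
    },
  };
}

const abt = makeABTContract();
// The curator (tx sender) takes an ABT from the artist, presenting the
// artist's signature over the agreement document.
const id = abt.take("curator", "artist", "ipfs://agreement", "artist:ipfs://agreement");
console.log(abt.ownerOf(id)); // "curator"
```

Either path requires the passive party's signature, so no token ever comes into existence without both sides having consented to the same `tokenURI`.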
These primitive functions have been deliberately called "give" and "take" because, in the World of Warcraft universe, where Soulbound items were first implemented (and whose in-game metaphor has become EIP-4973's baseline), players could also "give" and "take" certain items.
When a player completed a quest, the NPC often "gave" the player items, but it was at the player's discretion to keep them in their bag. Likewise, although this challenges the "consent" metaphor, players could "take" items from a dragon or firelord they had slain earlier.
So going back to our initial story of how EIP-4973 Account-bound tokens can help the music NFT industry's curation problem, here's what I'd like to say:
Right now, it is a one-way street where artists are practically desperate to have their tracks minted on popular curation websites. It's also a problem that the infrastructure for curation is proprietized: assuming this moat strengthens further over time, it'll raise the bar even higher for unestablished artists to publish.
And we must acknowledge that the above-mentioned dynamic won't go away overnight, either. But given our thesis that commoditizing infrastructure can break markets open, I think that by using EIP-4973 for consensual on-chain curation of music NFT playlists, we could achieve more equitable outcomes, since more people could become curators.
This is because standard adoption can initially level the playing field of access to attention. Instead of one website being able to promote its tracks and build potentially proprietary solutions, with EIP-4973 we have a primitive that can express bidirectional agreements between curator and artist.
And sure, we could implement contracts where the curator alone administers the listing. But I think that's an unwise design, given how reproducible content is curated nowadays. Rather, if authors and curators mutually agreed on which tracks end up on which lists, I believe this would carry a higher signal compared to curator-only feeds.
Additionally, by making the curation infrastructure usable by anyone, a greater degree of competition would improve outcomes and reduce the risk of a single party monopolizing.
Wide-scale adoption of this standard would incentivize indexers and other NFT infrastructure providers to implement its interfaces and make the on-chain data symbolizing publicly-visible agreements broadly available.
If you're deep in the traditional music industry, you probably know how complicated moral rights can get - and here's a standard that can potentially solve some of this domain's questions.
So how would this end up looking? Here's a sketch:
- `event Transfer(address from, address to, uint256 tokenId)` is sent.

Now, it'd be great if, additionally, a "curation release signal" could be emitted that clarifies to music NFT indexers when a new song is released. And that could happen after an agreement has been reached between the artist and curator.
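Until a dedicated signal like that exists, a crude stand-in an indexer could use today is treating mints themselves as release signals, since an ERC-721 mint is just a `Transfer` whose `from` topic is the zero address. A sketch, with simplified, invented log objects standing in for real `eth_getLogs` results:

```javascript
// TRANSFER_TOPIC is the standard keccak-256 hash of
// "Transfer(address,address,uint256)", as emitted by ERC-721 contracts.
const TRANSFER_TOPIC =
  "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef";
// Topics are 32-byte values, so an address appears zero-padded; the zero
// address topic is therefore 64 hex zeros.
const ZERO_ADDRESS_TOPIC = "0x" + "0".repeat(64);

// A mint (i.e., a potential "release") is a Transfer from the zero address.
function isMint(log) {
  return log.topics[0] === TRANSFER_TOPIC && log.topics[1] === ZERO_ADDRESS_TOPIC;
}

const mintLog = { topics: [TRANSFER_TOPIC, ZERO_ADDRESS_TOPIC, "0x" + "1".repeat(64)] };
const saleLog = { topics: [TRANSFER_TOPIC, "0x" + "a".repeat(64), "0x" + "b".repeat(64)] };
console.log(isMint(mintLog), isMint(saleLog)); // true false
```

A mint alone obviously says nothing about curator consent, which is exactly why a dedicated, mutually signed release signal would be the better primitive.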
Remember, minting an NFT is not a permissive license of your work. An artist retains all copyright, so technically speaking, unless a specific agreement is put in place, reproduction of a music NFT is simply a legal grey zone many seem to tolerate for now.
Hence this whole essay on chainifying two parties' agreement: the `string tokenURI`'s content would detail the conditions under which curator and artist have decided to collaborate to release the NFT.
Without such an agreement, the curator has no clarity on whether they're allowed to reproduce the artist's song on their site. With it, both sides have transparency regarding what can be done with the work and what can't.
So that's it, that's why EIP-4973 Account-bound tokens could drive curation in the music NFT industry. Anyways, this is a very long post, so I'd now like to start concluding it.
In this post, we've documented the problems of the emerging music NFT industry and how the aggregation of music NFT supply creates suboptimal outcomes for independent artists. Our thesis is that well-curated media content is useful to everyone, but that it can also easily lead to abuse of power.
To combat this potential negativity and to "break the curation market," our thesis is that the commoditization of infrastructure can help level the playing field. And specifically, it means that by mainstreaming consensual content curation, a new ecosystem of dApps could emerge that serves curators and artists alike.
EIP-4973 Account-bound tokens are the basis for expressing consensual agreements on-chain. Their lack of permissioned minting makes them ideal for broad application in a truly peer-to-peer fashion.
Tim
Additionally, I think it'd mean that we'd save tons of money on cloud server costs and running infrastructure, and that we could maintain a much smaller team.
In fact, when I tell others that we're "reinventing" parts of the Graph Protocol, they almost sound offended and start asking very skeptical questions about why we've decided to go down that path. So please let me outline some reasons why we've tried to steer clear of any hard dependencies for neume other than an L1 node, namely Ethereum mainnet.
Many software engineers will immediately point to a well-known fallacy in our domain, namely the "Not invented here" syndrome, which I almost make out to be a conspiracy theory of SaaS companies secretly plotting their way into developer culture.
It states that teams tend to "chauvinistically" avoid buying products or services of external origin. The term is usually used pejoratively by those who favor externally originated services.
And while there are a million arguments to be made about the improved scalability, modularity, separation of concerns, and economies of scale FOR using externally originated services, my goal in this post is to make it abundantly clear that, in neume's case, I see it as critical to steer almost entirely clear of these purposefully addictive solutions. And I want to say: you may not be able to fully comprehend this argument if you've never been in my shoes and proverbially "sat downstream of a service provider." But let me try to explain my case anyway.
There's a reason why we use the term "adoption" when talking about adding new dependencies to our software. Deliberately, I've tried using the term "adoption" over "addition" in this context, as it more precisely carries the notion of responsibility that comes with additional dependencies. "Addition" may suggest that "subtraction" is symmetrically possible (read: "easy"). But that's not what a react.js developer would tell you about their front-end stack and the respective lock-in. They'd say that subtracting isn't the inverse of addition: while "integrating react" may generally be reversible, it's not as easy as "removing all react.js-specific code previously added."
And so, rather, I think the term "adoption" is more appropriate here, as adding a dependency also means "taking care of it in the future." For react.js, that means following their kind-of-crazy roadmap of ever-ongoing front-end "innovation." It also means being "OK" with Facebook pivoting to "Meta." And it means that, although you initially had an idea of where the project may be heading, over time you stop having that sound understanding, and your proposed change sets for adjusting the project to your use case may also be rejected upstream.
So adopting a dependency is anything but straightforward, and I'd actually argue that it can kill or strengthen your software product immensely. And let me be clear: this is not some theoretical example that happens on the fringes of the react.js roadmap. "Not invented here" has become a famously pejorative phrase in our software developer language precisely because some are vehemently for external dependencies while others aren't. It is also where I'd like to point out Charlie Munger's quote, namely:
Show Me the Incentive, and I'll Show You the Outcome
It's those who decry "Not invented here" that have something to sell, while it's those who view it skeptically who have been burned.
And this leads me to a simpler, first-principles conclusion to this dilemma. Yes, on one side, we're potentially wasting time "reinventing the wheel," and "absolutely," we'd be faster and more cash-efficient by "just" using turn-key solutions to build neume.
But, on the other hand, at least personally in my career, I've come to observe the immense power of having control over the software and how it's being run. Control is power, control is money, and retaining control is compounding future power and cash flows. And although these are just a few words in sequential order, they couldn't be more significant in the context of open source or, generally, software development.
Let me ask you: Have you seen Apple's recent announcement to add consent-tracking warnings in front of all app dialogs that mine a user's data? Have you seen what that did to Facebook's stock price? And do you realize that this may have led to Facebook's unheard-of pivot into the metaverse? Yes, that's right - control is everything: billions of dollars, if you want. And for Facebook and its bullies (the operators of Android and iOS), this is just the latest chapter of a years-long fight in which Facebook even attempted to launch its own phone for control.
But my anecdote on this topic is maybe a bit different, and it has to do with my previous project, rugpullindex.com, where the problem was of a similar nature: sitting downstream of a busy upstream river can "pollute the water" so unreasonably that it ends up affecting the product.
So much so that you may find yourself "begging for upstream changes," or stalling on innovation over properties like "water quality," quantity, or other unforeseen diplomatic issues. It can lead to a project's failure, or to the necessity of taking the entire infrastructure, rearchitecting it, and starting from scratch elsewhere.
When I started neume, it was very much rugpullindex's crawler 2.0: a re-architecture away from The Graph protocol, steering clear of implementing platform-specific services, hard-coding only the Ethereum node, and not using any fragile SaaS offerings "to speed up and simplify" things. I had to learn this the hard way: through bug reports waking me on Sunday mornings, through endless discussions and frustration with upstream maintainers - by getting burned out.
With the neume network, I'm confidently baking these key lessons into its architecture. And I'm also committing myself to not repeating my past mistake of becoming addicted to software-as-a-service middlemen.
We've built neume with composability in mind. It's free software, and technically it doesn't really "run anywhere" in particular. It's constructed to be credibly neutral, and I'm holding myself accountable for the relationship it maintains with everything using it. neume should be like Linux: you may regret using it because it won't function well all the time - but you should never regret adopting it because you find out later that its maintainers are playing a power game for the bucks in your wallet.
And that's why we're steering clear of any major dependencies besides Ethereum mainnet or other potential L1s. The extractive tendency of software services that want to monetize is potentially transitive: if we were to integrate, e.g., The Graph protocol, and they "turned on monetization," that would affect us too - and we want to avoid that.
Tim
In the process, one of them had taught me the basics of xargs - or at least they were trying. And so, using curl, we sequenced through the natural numbers of the API we had once architected - hoping to rescue what remained.
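For a flavor of what that night looked like, here's a rough reconstruction of the kind of one-liner we ran. The host, path, and ID range below are placeholders, not the real ascribe endpoints, and the leading echo makes it a dry run:

```shell
# Hypothetical reconstruction - api.example.com and the ID range are
# stand-ins for the long-gone ascribe API. seq generates candidate IDs;
# xargs substitutes each one into a curl invocation. Drop the leading
# "echo" (and mkdir a backup/ directory) to actually fetch.
seq 1 5 \
  | xargs -n 1 -I {} \
      echo curl -fsS "https://api.example.com/pieces/{}" -o "backup/{}.json"
```

Adding `-P 8` to xargs would fan the IDs out to eight parallel curl processes, which is what makes walking a sequential-ID API feasible in a single evening.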
I don't know if it was the beer or the egg salad - but something made us feel triumphant, although it would turn out that we had failed.
I mean, our efforts were not totally without effect, and so we were indeed able to back up some of the data. But in the grand scheme of things, we failed - because we didn't back up the artists' manifestations and because, years later, the metadata turned out to be useless.
This, my friends, is the story of a failed startup and how some colleagues, now friends, tried to back up digital art. It's the story of ascribe and "ascribed." And it's not fiction - what you just read is an actual account of a night we spent trying to back up data from an S3 bucket. Sadly, though, we weren't diligent enough, and so when public access was revoked, almost all ascribe metadata and, more importantly, all the digital art files ended up being deleted.
But let's backtrack a second: In 2015, I started working as a front-end developer for a company called ascribe, and what we did was register digital art on the Bitcoin blockchain. Bitcoin, because Ethereum simply hadn't been around yet. Digital art? Yes, really - like a precursor to the modern-day NFT.
Those were the early days, and browser plugins like Metamask didn't exist yet - but our idea was not uncensorable media. Instead, it was to decentralize digital art's ownership layer and make it ownable, like you could own a Bitcoin. And we ended up doing an OK job on that, as the SPOOL protocol is still accessible through the blockchain, so technically, ownership can still be tracked and viewed.
But then, in 2016, the early-stage startup ran into funding issues, as back then, having daily active web3 users wasn't a criterion VCs valued. And so we pivoted the company to build a scalable blockchain database called BigchainDB instead - ascribe was cast aside as a product focus but remained online for quite some time post-pivot.
Still, and this is somewhat reasonable, when refocusing its attention, BigchainDB Limited eventually decided to shut down ascribe, and unfortunately, they took down most of the art too.
So that story at the beginning, of that moonless night - it's from a moment shortly before ascribe's official shutdown date when, I think, I wasn't even working there anymore, but we still felt compelled to save what the artists had uploaded.
Today, I felt motivated to share this story publicly, as I think it deserves more recognition from those contributing to the NFT space. When NFTs popped off in 2021, some of the OGs' conversation turned to restoring ascribe and its prehistoric art. I and others were asked: What was ascribe? What were the artworks? How can I access them, and are there any archival or restoration projects?
The sad story is: We didn't have answers. In fact, we tried getting together on GitHub, and we did some reverse engineering on the open-sourced backend. With a restoration project we founded called "ascribed," we thought: Maybe it's possible to re-instantiate ascribe as an NFT-native platform by migrating the SPOOL protocol stored on Bitcoin over to Ethereum.
But my frank assessment is that it's going to be close to impossible - and, possibly, that I'm now also OK with that. It's because I see some other good projects embodying the spirit and vision of ascribe: making digital art permanent, archiving it, and fundamentally alegalizing it through crypto-native protocols.
Today, seven years later, ascribe is an integral part of the space's story and a stark reminder that, contrary to what our parents kept telling us - namely, to use the internet cautiously as it never forgets - the internet has actually forgotten a lot of important stuff, for example, the ascribe library.
So I want to take this moment and motivate two critical aspects of digital art and media platforms that we should strive to implement: It's (1) permanence and (2) alegality.
Permanence is the temporally independent, integrity-preserving accessibility of content. IPFS implements permanence through content-addressability, creating integrity through file hashing and making everyone capable of replicating a given file. The Internet Archive makes past websites available through the Wayback Machine, creating a permanent web of all versions throughout time. Bitcoin and Ethereum keep all ledger data available to all network peers, enabling peers to recompute the entire accounting history and preserve the network's integrity.
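The core idea of content-addressability can be shown in a few lines of shell. This is a toy illustration, not IPFS's actual CID construction (real CIDs wrap the digest in a multihash/CID envelope), but the principle is the same: the address is derived from the bytes themselves, so identical content always resolves to the identical address, and any mutation changes it.

```shell
# Toy content-addressing: the file's "name" is a function of its bytes.
printf 'some digital artwork' > original.bin
printf 'some digital artwork' > replica.bin
sha256sum original.bin replica.bin   # both files print the same digest
printf 'Some digital artwork' > mutated.bin
sha256sum mutated.bin                # one changed byte, a different digest
```

Anyone holding the bytes can re-derive the address and replicate the file, which is exactly what makes permanence a property of the network rather than of any single host.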
Alegality is a subtractive measure to decrease reliance and dependency on institutional law. Alegality means putting a minor emphasis on legality - not in opposition to legality (as that would be "illegal") but rather conceptually solving legal problems without the law being a relevant part of the solution. Bitcoin solves the double-spending problem by building an uncensorable, permissionless consensus network that never requires the intervention of legal recourse. TheDAO's white hat hackers settled the recourse of the then-defunct smart contract. And NFTs solve the problem of attribution and monetization of art through on-chain registration and financialization, not through copyright defense.
Permanence and alegality are some of the tools of Jaya Klara Brekke's "hacker engineer" for creating autonomous digital policy. The story of ascribe shows us the importance of autonomy, alegality, and permanence - had we treasured those aspects, more of its art would be available today.
The tools of the hacker engineer are to be applied with care, but they are vitally important to understand, practice, and implement in order to preserve on-chain music, art, and culture.
Tim
The goal of neume should be that it can hit “escape velocity”, defined as:
All three of the above criteria relate strongly to pushing on from the incubation that HIFI Labs has generously provided to neume over the past year.
Successful transition to escape velocity will see self-perpetuating growth of the project, building on the social scalability foundations that we have already laid.
To date, neume has been solely financially supported by HIFI Labs. The next stage of neume will be built upon more varied funding.
To this end, despite great progress over the past couple of years in the development of DAO methodology and infrastructure (e.g. Nouns Builder), I personally don’t think that this is the right model for neume at this time.
While there is clearly a lot of cross-over between DAOs and Open Source projects, my view is that a project like neume shouldn’t be “owned” by any distinct parties, and funding through some form of token creates divergent incentives. The values and social contract that we started neume under - social scalability, credible neutrality, open infrastructure - require the project to be “unowned”, permissionless, and free (as in libre).
It would probably be easiest to spin up some kind of token, but the medium-term implications are severe if motivation turns to token value, which is ultimately inevitable.
So, if we discount tokens as an option, then we can focus on the following routes towards financial independence…
Grants from independent bodies (such as gitcoin) or aligned projects are a likely source for some funding that neume should pursue. They can provide supplementary funding free of restrictions.
However, there are challenges with the grants approach, especially those applied for via an independent programme. There is usually significant front-loaded effort, writing bids and cutting through the noise of other applications. Funding can often be delayed, and require matched finance.
For these reasons, I think that the optimal approach for neume in the short to medium term is to engage with larger projects in the music & blockchain and web3 open source development communities, and seek support.
Similar to grants, though even more unrestricted, donations are a viable option for supplementing neume’s financial independence, albeit unlikely to provide the foundation for funding compared to other routes.
Not specifically a route towards financial independence, but a very important accelerant towards escape velocity and beyond for neume will come through development contribution provided by affiliated projects.
The purpose of open infrastructure and composability is to alleviate the need for parallel development within organisations, a core issue in traditional industries. And thus, our aim at neume should be for it to be attractive for projects to develop within it, rather than in parallel to it.
A more adventurous, but ultimately sustainable, route towards financial independence for neume will be to start a service company alongside neume that funds the project out of revenue that it generates.
Examples of services that could be built: platform integrations, NFT database caching and provision, rights and authority database, NFT accuracy insurance, etc.
In a well-functioning ecosystem, a not-for-profit service company run by neume would be in competition with other companies for the services that it provides, the majority of which would ultimately become commoditised into the network over time.
And thus, this funding route can essentially be seen as building a business around elements that are not yet ready to be ingested and open-sourced by the protocol, and then continuously developing the product line as previous elements are commoditised.
neume’s development roadmap to date has been driven solely by HIFI Labs’ requirements for musicOS.
This has been extremely useful for focusing our prioritisation while establishing the project over the first year, but reaching escape velocity requires a more varied set of use cases and needs, to form the network into something that is useful to, and used by, the entire ecosystem.
Development requirements will come from two key sources - projects that are building music NFT distribution platforms, and projects that are building music NFT experiences.
neume’s priority to date has been to focus on the few, largest, music NFT distribution platforms, Sound and Catalog (and Mint Songs, and Noizd), which provided the highest value for development effort while establishing the architecture. Moving forwards, we are looking towards the Lens and Decent ecosystems as the potential next additions, as well as “custom contracts”, for example those released through factory contracts from Zora and Manifold.
However, I am aware that this prioritisation remains driven by myself, and HIFI Labs, and may not in fact reflect the needs of the ecosystem at large.
Bringing projects onto neume will not be an overnight victory, and should be seen as a constant endeavour, though it is hoped that at some point there will be a tipping point where it becomes the de facto infrastructure to build on and ship to. We should do everything that we can to make neume this.
A product of the previous escape velocity conditions, but likely the most important of all. neume will be successful once projects (users) and developers using and shipping to it are growing at a steady and then increasing rate.
There are probably reward mechanisms that we could build to encourage this behaviour; however, as with the comment above about tokens, I am personally cautious of financially minded participants disguising themselves as genuine active users.
This area is something that we should develop going forwards. As it stands, we have github commits and open office attendance as measurable indicators, but we should think about more comprehensive metrics to understand how active the project is becoming.
A final section to discuss, though not directly a condition of escape velocity, is the management of neume the project itself.
As things stand, I am somewhat the default director of proceedings, but as the above elements grow we may need to think about more democratic approaches.
With that said, my personal approach to governance is one of simplicity, governance only where required, and subtraction rather than addition. One of my frustrations with recent developments within DAOs has been the over-complication of decision making (usually to justify a “governance token”).
For now, I would propose that we continue management of the project as it is, but remain alert to where we may need to alter things due to there being friction or deviation from our values.
A specific element that is worth highlighting though, in context especially to the funding options above, is that some will likely require the establishment of a treasury, and the accompanying legal and financial requirements that come with that.
For this, I suggest that we would likely want to establish a “neume LLC” (or equivalent), which could then be the vehicle for the “not-for-profit services company” detailed above.
Dan
This project update represents a move towards more regular and higher-quality external communication. There's a lot going on, and it's important that we celebrate the wins as well as make it as easy as possible to stay up to date with the project.
As I stated in my last post on this blog, Taking neume to escape velocity, we have three key goals over the next few months:
The opportunity to build into the Lens ecosystem hits across all three objectives, supported by a very kind grant.
Lens poses specific questions, not least in that it challenges some of our previous architecture decisions - for example, the use of "publications" (and "pubID" identifiers) vs. our current approach of building around "tokenID"s.
The first month of our Lens engagement will focus predominantly on generalising the neume architecture, before moving into the development and implementation of Lens-specific strategies and pointing the crawler towards Polygon and the Lens contracts.
Additionally, we are currently thinking hard about indexing "curation" and "agreements", with Lens offering the perfect environment in which to explore some ideas.
We are currently discussing the Lens engagement in depth in a github discussion here.
As alluded to above, we decided last month that we had hit the limit of what Discord could provide the community in terms of effective communication, and sought a more suitable alternative.
After much research, looking at Discourse and other options, we decided as a group that github discussions hits the mark in terms of well-structured topics, openness, and closeness to the code.
We are in the process of fully moving over to this new home for communication. If you have any questions, thoughts, feelings, emotions, or otherwise, then please come and join us here!
As part of this move we have also moved our weekly open office to Google Meet, which is open to anyone to join. Add it to your calendar.
Finally, I know that these updates are a useful addition to interested parties that don't have the time to engage on a day-to-day, so I will be utilising the blog a lot more going forwards. I promise.
The community building neume continues to grow, from both a shipping and a discussion perspective. A key highlight from our last few open office sessions has been the engagement of a number of folks from the Public Assembly community - special shout out to KD!
The beauty of open source projects is that it is not only the code that is open and composable, but the people surrounding it also. I find the pursuit of common edges personally very exciting and can't wait to see how this develops going forwards.
To facilitate a growing group of active participants, we decided as a group to expand the project maintainers, promoting il3ven and neatonk to join myself and reimertz as stewards of the project.
At the very core of the values that we are building neume towards is the belief that the best open source projects are open to anyone to join, at whatever level they wish, and to pursue whatever interests them. The net summation of our efforts defines the overall output of neume. So, please join us. We want to hear your thoughts.
As mentioned, in this blog and the one previously, financial independence for neume remains my number one priority. Aka, it's grant szn bby.
I am thinking through greater plans of public goods funding for the wider onchain music ecosystem, and so if anyone else is interested in this topic then please hit me up.
In the meantime though, our short-term focus will be on augmenting the grant that we have received for building into the Lens ecosystem with putting together a strong application for the incoming Gitcoin "Beta" round.
Gitcoin rounds operate on "quadratic" match funding, and so we will be putting together a marketing campaign for neume alongside the bid to build as much interest as we can. Please keep an eye out for this and support us wherever you can.
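For context, the standard quadratic funding formula (from Buterin, Hitzig, and Weyl's "liberal radicalism" mechanism, which Gitcoin's rounds approximate, subject to matching caps and anti-Sybil adjustments) allocates a project's share of the matching pool in proportion to the square of the sum of the square roots of its individual contributions:

```latex
% Matched amount for project p, given contributions c_{i,p} from donors i:
M_p \propto \left( \sum_i \sqrt{c_{i,p}} \right)^{2} - \sum_i c_{i,p}
```

This is why breadth of support matters more than size: 100 donors giving $1 each produce the same $(\sum_i \sqrt{c_{i,p}})^2$ term as a single donor giving $10,000, so the broad base attracts far more matching - hence the marketing campaign.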
Dan
May has been spent predominantly on building out indexing capacity for Lens. We have now completed one third of the three-month grant that we were offered, and done a lot of the thinking required to deal with the requirements of the Lens ecosystem.
In addition to this, with an eye on the future, we have been exploring the “network” aspect of “neume network”. At this stage we are thinking through how users could use neume to index for specific queries (rather than having to crawl for the whole dataset). This is likely going to be a key element of research and development going forwards, with some exciting work going towards how different neume nodes should communicate and sync with each other.
Finally, we also ran a very successful Gitcoin round in the Open Source Software category. At the time of writing the matching calculations still need to be ratified, but it appears that we will be in line for c.$600 in addition to the c.$2,000 raised from donations. Thank you so much to everyone that contributed. We have an open discussion in our github forum right now to consider where the highest-value use of those funds will be - please feel free to contribute!
We have made good progress on this stream, writing strategies for Lens and testing how these work with the current neume architecture. This has resulted in an updated schema, as defined within this PR, which has been merged into the codebase.
Inevitably, we have met some expected challenges with regard to scale; there are of course many more records coming through from Lens than from the other contracts that we have previously been indexing. This has triggered an interesting discussion within the community around the best way to focus the indexer and filter results.
The discussion has culminated in three potential options:
Our estimate is that there isn't much difference between the second and third options with respect to speed. The third option is better because it covers more ground and opens up the potential for future collaborations with front ends. So, we will start with the third option and move towards the first if needed.
After much discussion about the right database structure to output crawled data to, we have moved neume from LevelDB to SQL, as can be seen in this merged PR.
In addition to this, we have started to look forwards towards the next version of neume, whereby we want to make it easy for new users to run a node in an efficient and effective way (see discussion here). This is still an area of active research as we progress neume towards its future network state.
Starting out with the goal described, this stream has expanded into a neuIP to create a more generalised methodology for neume users to index only the data that they require (rather than everything).
The goal of this month's work was to conduct research in this area and then propose a neuIP, which can be seen here: neuIP-5.
Next steps for this workstream are likely to include:
A specific outcome will be an updated schema and reference implementation which can be used to generate subset schema IDs and derive subsets via constraints. This would also aid discussion by making the proposal more concrete.
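As a purely illustrative sketch of what "generating subset schema IDs" could look like - none of the identifiers or the constraint syntax below are from neuIP-5; they are assumptions for the example - a subset ID could be derived deterministically by hashing the base schema ID together with a canonically ordered constraint list, so that any two nodes indexing the same subset agree on its name regardless of the order they wrote their constraints in:

```shell
# Hypothetical sketch: "neume-schema-v1" and the key=value constraint
# syntax are invented for illustration. sort -u canonicalises constraint
# order, so equivalent subsets always hash to the same short ID.
base="neume-schema-v1"
constraints=$(printf 'platform=lens\nchain=polygon\n' | sort -u)
printf '%s\n%s\n' "$base" "$constraints" | sha256sum | cut -c1-16
```

The design choice doing the work here is canonicalisation before hashing: without it, `chain=polygon;platform=lens` and `platform=lens;chain=polygon` would name the same subset differently and nodes could never sync on it.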
An additional area that we are keen to add to the neuIP-5 discussion is ways to link datums across related schemas, like tracks and NFTs. Alternatively, related datums could be embedded to simplify access. Ideally we could support both and let node operators choose the trade-offs that make sense for their use case.
Lens grant
Gitcoin grant