2013/05/21

Magma Rants 1: Introduction to the Language

I've been doing a bit of a thought experiment recently, around the idea of programming languages. I imagine the vast majority of programmers do this, so I don't think I'm special here, and while I would love for this idea to go somewhere, the pressing need for a sustainable consistent influx of cash is more important than solving big problems and fixing the world.

I have a manifesto writeup going on in a Google Doc, found here. I regularly add sections over time; the real pain is when I redo parts of the spec because I don't like the way something is petering out. Here are some examples of that:


  • Initially, I liked the idea of using colons for assignment, such that in constructors you would use foo : 5, in variables you would use float bar : 3.5, etc. I gave up on this for two reasons. One: even though the precedent of using an equals sign for assignment isn't really mathematically accurate (and I really liked the idea of not needing the terrible == for equality), compound operations like +: and -: just look ugly with colons attached. Blame my great aesthetic design mindset. Two: giving it up frees the colon to be another dedicated glyph, and lets me separate property access (.) from scope (:), which I prefer.
The purpose of Magma is to solve a problem - which I hope most languages aim to accomplish - by accepting a reality of native language design: you want a kitchen sink you can write firmware or a kernel in, but you also want to actually build working applications with it. I think my approach is a step towards solving this problem, in that the Magma specification outlines multiple effective compiler specifications, with different error conditions and warnings depending on the chosen context. Contexts are compiler flags specifying what kind of binary you are constructing, and the compiler builds each library and executable in its own context, which is an optional header in its metadata specification (depending of course on whether you want to compile this to, say, LLVM, or some newer binary format for a new architecture). The default context is the application context, but the language specification has multiple contexts:
  • system - allows raw bitwise operations, raw shifts, raw pointer manipulation, jumps, and pointer assignment to integers; casting between the various integer types (pointers, ints, fixed-width chars) won't produce compiler warnings or errors. You can also use asm: blocks to write inline assembly. This context also outright disables exception handling and any bounds-safety checks in standard library classes. It is meant for kernels and firmware, and should be used sparingly - even a kernel proper should have most of its libraries written in another context. It is also a good idea to isolate system context code in its own libraries or binaries, independent of the bulk of a main application, as a sort of "danger zone".
  • lib - compiles libraries instead of executable binaries, with no main function. By default it uses the app context (lib is an alias for applib). You can create libraries in other contexts by suffixing the context name with lib, such as systemlib.
  • app - The full standard library is available, but you can't use raw shifts, jumps, pointers (use refs) or std:bit (bool and std:flags will still use bitwise internally just fine). 
  • web - Targets the Fissure intermediary language, enabling binary web applications. The full ramifications of this context need to be ironed out through trial and error - it has no file system access, no access to the networking stack besides the convenience HTTP send/receive layer, etc.
Besides compiling binaries in various contexts, for security purposes Magma apps need to specify (in their metadata, or embedded) the various std parts they use and what system resources they access (files, network, contacts, accounts, 3d video, audio, etc) so they can request permissions in a mandatory access control environment seamlessly.

The broad objective is to recognize 5 things:
  1. Programmers like familiar syntax. If you can get one syntax across an entire stack of languages for various purposes (which Magma / Fissure / Stratos are intended to be), you can significantly cut the time investment it takes new developers to pick up the entire technology paradigm.
  2. People like readable code. Magma not only tries to achieve minimal glyphic overhead and readable code, it can be written whitespace-significant or not (using traditional curly braces and semicolons) to enable choice.
  3. People like choice. Choice of paradigm, choice of library, and contexts enable a choice of warning states.
  4. Times are changing. Heterogeneous computing is going to be huge and massively important, and no modern language is going to have an easy time tacking on easy to use SIMD functionality the way Magma will with std:simd functions, parallel profiling in the compilation stage, etc.
  5. Build files suck. Qmake, CMake, etc. - Scons is really neat, but Python (lacking contexts) is hard to bend into a nice syntax for compilation instructions. Enter the Stratos build context, which is the dialect of Stratos used to write .stob files that build Magma binaries. I mean, build systems collectively blow, and none of the modern languages (Go, Rust, etc) have an immediate solution. Make syntax is completely alien to imperative languages (just like shell is completely alien) and the raw effort to learn them all is absurd and unacceptable for new developers in the coming years, at least to me.
I think we can do better than what we have, and go beyond C++, D, and even Go and Rust, and really recognize the need for a native language you can write anything in while keeping it simple. Contexts, I like to think, make this a lot easier than trying to kitchen-sink everything into one compiler state and hoping people don't break stuff with premature optimizations into inline assembler.

The point of this blog series isn't to write the manifesto, but to brainstorm aspects of the language ideas I have by writing them down.

2013/05/12

Game Rants 3: Neverwinter: Part 2: The Mid Game

The mid game starts in Blacklake and ends in the Northdark. It spans from level 5 to 60, which means it is the super-majority of leveling content. As it turns out, that is most of what the game is, and that is what I'm touching on here.


Part 2: The Mid Game
Pros:
  • There are options to leveling - foundries, quests, dungeons, skirmishes, and pvp all award xp, so you can level any way you want (in theory).
  • Zones have directed stories - you start near the entrance, progress through quest hubs, and eventually reach a dungeon at the end. Every zone has a corresponding big bad in a dungeon to kill, and all these dungeons have epic level 60 versions too.
  • Powers are obtained quickly early on and taper off at higher levels to 2 - 3 new abilities every 10 levels, with the bulk of powers coming early. This means you get the majority of your ability choices quickly, and have actual skill choices early on.
  • Most abilities are useful for something, at least on the TR, CR, and Cleric from my experience. There are a few stand-out "garbage" powers that are useless, but they are few and far between. In most circumstances, the powers you run are situational, which is good for the choice paradigm.
  • Quest dialog and most mobs have voice overs, which help immersion.
  • The graphics are great, and the game runs exceptionally well on Arch (which is where I'm running it from).
  • Music is great.
  • There really isn't much pressure in the pay-to-win direction while leveling. The leveling process is really fast, so XP boosts are not necessary, and you can easily hit max level just doing the leveling quest content. You can't usually afford a mount as soon as you hit level 20, but you almost always can by 23, and the uselessness of gold for the most part makes it easy to justify the purchase. Bag space is the biggest offender, but you do get 2 bags (one at 10 and one at 30) that somewhat offset the bag pains, and proper inventory management lets you do entire quest hubs, visit the vendor, and repeat ad nauseam without really feeling a space crunch.
  • PVP auto levels you to the X9th level in a bracket, so there is no level imbalance. Ability and gear imbalance can contribute somewhat to the experience, but each battle gives half a level and you can get max level pretty quickly through it, so it is pretty balanced leveling pvp. I feel like they learned well from both the failings of the no-balancing WoW and the everyone-to-max level TOR.
  • Beyond the visuals, the environments are amazing. Being in the crater of a volcano, or climbing a massive Ice Giant's peak, or battling Gray Wolves beneath a giant flaming wolf-carved mountain all present epic landscapes. The look is great and really conveys strong atmosphere. A real standout was the Chasm, which progressively got more corrupted the deeper you went.
Cons:
  • There are very few consistent characters and no overarching story arc in the leveling content. It means there is very little engagement with any zone's story, because you know the characters are fleeting.
  • You don't change the world through your actions. You go places and kill monsters, but they will still be there afterwards - Helm's Hold is still under demon control, Icespire is still covered in giants, etc. There is no real phasing, so the game doesn't change as you accomplish things, cheapening the engagement even more.
  • Quests are rarely dropped, and always require running back to town. This is offset by having multiple progressive objectives in a single quest, but the need to go back to the Protector's Enclave to turn in a completion quest after each zone accentuates this issue - even though you do go back for a reason (usually when you finish a zone, there is another one to progress to).
  • While the zones looked awesome, very few environments were manipulable or changed during progression. It was a very static world - besides the mobs standing around waiting to fight and the quest NPCs idling in camps, the world itself is fairly unchanging.
  • There is a lot of "free" content here - the leveling process is still dragged out, and these zones were all complex works of art that took a lot of effort, but they still reek of being unnecessary.
  • With the foundry nerf, the only ways to level now are quests and pvp. Dungeons and skirmishes give awful xp per run for the time commitment, and the gear you get is so fleeting they often aren't worth doing outside the fun of seeing them the first time (which is really fun!). Foundries are almost never worth doing anymore, ever, which I feel hurts one of the game's best aspects.
  • When leveling up, powers have very nebulous descriptions, and a huge component of how viable an ability is is its animation - how long it takes, where it goes, etc. You can't figure this information out ahead of time, and with respecs costing real money, there is no easy way besides lots of out-of-game research to figure out how to distribute your limited power points.
  • Dungeon roles are questionable. Guardians aren't really needed because most boss mobs spawn tons of adds that a guardian can't aoe tank, while the bosses themselves barely do any direct damage and often swing so slowly anyone else can dodge them. Clerics are absolutely mandatory for almost anything past the Crypts, but the dungeon queue system will stick 5 dps in a doomed group instead of requiring a cleric (at least).
  • Aggro is very broken. For the most part, it is a combination of distance to target and damage done, but guardians can't outaggro healing aggro, which seems to apply from any distance. This means most mobs can't be pulled off a healing cleric, which makes combat a one dimensional kite rather than a coordinated utilization of classes filling roles.
  • This might be a bug more so than a hard negative, but the group travel mechanics are very annoying. You can't transport between solo player zones and dungeons as a group, and it forces you to wait for your party to go almost anywhere. I feel like the entire mechanic is a pointless holdover from D&D proper, and letting people zone into places on their own wouldn't hurt.
The biggest issue with the mid game is the needlessness of it all - the story is too static to be deeply engrossing, aside from a few rare characters like the lovers that show up in both the Plague Tower and the deep Chasm. Because your actions don't have a lasting impact on the world, and the overarching story of catching Valindra takes a backseat after the tutorial and rarely pops up even in passing except in engagements in the city or in the Ebon Downs, the plot is all over the place and leaves players wanting.

This contributes to a greater sense that way too much development time and effort was put into this leveling content - from voiced over quests, to well realized zones, to all the different monster models and animations, a lot of this seems like a poor allocation of resources when launching a f2p MMO - people will level once, experience this content once, maybe twice if they level the single alt the game lets you roll without paying money, and then they expect a repeatable end game to keep them playing.

And f2p depends as much as any game on its persistent players to bring in the new players who spend money, and to consistently buy new trinkets as they enter the store. Your hardcore audience is your most reliable source of spending, but you don't win them over with a lot of well designed questing zones, because they do those once and never come back.

I feel like the game would have done significantly better on a slashed budget with level 20 as the cap (and 3 power and feat points per level) than with the level 60 cap and all these excessive leveling zones. Let us be honest, the Plague Tower quest chain would have been a great point to finish up at a level cap, and then just add a few levels at a time as new zones are introduced, with early access to those zones for a few days - of course, deleveling players that go in them and gain higher levels, so they don't get a power advantage in PVE or PVP until after everyone can enter them. All these zones are just massive developer sinks that absolutely took lots of development time and will produce very little return on investment, both in player time engaged in them and in income as a result of them. Like I said earlier in this post, there is little incentive to spend money while leveling (the lockboxes I feel are a good exception, and while even the box keys are radically overpriced, $1.50 a key is much more reasonable for most players to shell out on a whim).

This is even more pronounced given a neutered end game - and, in praise of Cryptic, the leveling content is not short but it isn't excessively long either. It might wear on some players but it won't on most, and the ~60 hours to hit max level finishing the quest content (which many players might not even touch while pvping) is acceptable. My argument against the quantity of it is directed towards Cryptic's bottom line - they spent a lot of time making these beautiful zones with forgettable one-off plot threads that few people will be paying money to experience, since they gave it away for free. Kind of like how TOR gave away the best part of that game by making it f2p.

Another issue is the routes to 60 - any group content gives awful XP per run (I feel like completing a dungeon should give at least an entire level in XP, and a skirmish at least half, for the time commitment and the awful XP returns from just killing mobs) but doesn't give enough gear per run or per unit time to justify doing them between quests. If you follow the actual progression the devs laid out in terms of questing content - run a zone, reach the end, do its dungeon, then move on to the next zone - you will outlevel zones in no time and end up unable to queue for the dungeons whose quests you only just reached at the end of a zone. Nerfing XP gains isn't an effective tactic either, because some people just want to get to end game, and making them grind there, even with solid quest content, isn't making them happy paying customers any time sooner.

The foundries were fun and interesting apart from the normal quest content. They would mix a lot less fighting, or a lot harder solo content, into an otherwise monotonous zoned questing experience, and if not for the exploits around knocking mobs off platforms or farming ogres, they would have been a great complement to the leveling experience. In the next part, I'll go into why the foundries are now completely useless besides the fun they provide (and remember, games are about having fun - and I can't forget to mention I only write this much crap because Neverwinter is, at the end of the day, quite fun. Flawed, which is what I'm getting at, but still fun).

2013/05/11

Game Rants 3: Neverwinter: Part 1: Intro & The Early Game

This will be a 3 part series on the new MMO by Cryptic, Neverwinter. It entered "open beta" about 2 weeks ago, and I have been playing it with friends pretty much non-stop ever since, until hitting max level a few days ago. This blog arc will be a story of 3 parts: I will discuss the "phases" of the MMO (the early game, the mid game, and the end game). I will speak of what I like, what I dislike, what could be fixed, and what should be learned as a lesson for future MMOs.

Part 1: The Early Game
Pros:


  • Excellent character customization, including an excellent cascade of customization complexity - you could go with a wide variety of presets, or tweak everything yourself.
  • Excellent race selection, and excellent monetization scheme of having paid-for race choices (at launch, drow is locked for a month except for founders).
  • Excellent initial class distribution, with a few questionable choices.
  • Excellent CGI intro, defining the conflict and setting well.
  • The dynamic patching is excellent, enabling players to obtain game assets when needed rather than in raw bulk patches, but the game keeps an option in the launcher to update everything immediately, which I did overnight after a few days of playing and tiring of the ~3 minutes of file patching needed per zone. Excellent tech.
  • The opening zone (and the entire game) is voice acted, adding to the atmosphere.
  • Visuals are wonderful for an MMO, animations are usually good, soundtrack is excellent.
  • Gameplay is good for an action mmo, the best yet in my opinion - animations lock your character in place to add weight, and you have 3 discretized classes of spells - at will (spammable), encounter (short cd), and daily (once per fight usually, require obtaining action points by doing your job to use).
  • The intro zone has a dedicated story of getting into the city, introduces the main villain, and shows off the most important game mechanics.
  • Visuals on the bridge (mortar fire, fire in general, crumbling expanses) are great.
  • The dracolich corpse at the start is a direct tie in to the intro cinematic and is a really good story builder.
  • Tutorials are informative - abilities are flashy as you level up and skills appear on your bars, tips tell you the specifics well.
  • You are able to write your own story for when people inspect your character.
  • Character titles, including two right off the bat from your god and home.
  • Moderately good server stability for an MMO launch.
Cons:
  • The characters introduced in the cinematic are never touched on again, and the game seriously (in the long run) lacks persistent characters besides Jonas and his wife throughout the spellplague storyline. This starts here with your companion on the bridge, who dies. Barely any other characters are persistent (and the Sergeant you constantly report back to in Protector's Enclave is barely characterized after the start).
  • The ability scores are really stupid to have in character creation - they aren't actually rolled, there are just ~8 presets of score rolls to pick from. While mousing over them tells you what is "good" for your class, you have no idea what really is best for you, and any new player is going to be completely lost on what to pick here. Ability scores also matter a lot. The consequence is that a major part of the character is defined off the bat, remains unchangeable forever, and can cripple a character before it's even made.
  • Abilities are slow to come by - it takes a while to unlock all of your (outrageous!) 8 ability binds (counting the tab class ability). It makes combat really lackluster at the start since you really are only spamming one at will and one encounter and doing nothing else.
  • The paragon system in conjunction with the "specialized" classes like great weapon fighter and trickster rogue rather than just fighters and rogues means down the road the game is going to either focus on providing classes with paragon paths (good) or adding excessive classes (bad).
Overall, the beginning is a resounding positive. The combat is sound, the graphics and audio are good, nothing is glaringly wrong - in most bad MMOs, bad textures or broken models or crappy combat or lag would hint at a flawed game right off the bat, but that doesn't happen here. The game has some polish.

My primary criticism of the introduction is a lack of engrossing story - you land on a beach with a hand-waved boat crash, you progress through a battlefield doing things for random people, you find a guard who goes with you across a bridge, you find the necromancer from the cinematic, fight a hulk, finish up, and never touch on any of this again. It is a pervasive issue in MMOs that the stories rarely carry weight - the effects of the intro only lead you to the Enclave in the city, and while NPCs will remark you were the one who fought on the bridge, it quickly evaporates and the impact on the story is negligible.

Also, the presence of prerolled sets of ability scores to choose from is, in my opinion, an unnecessary holdover from D&D proper - yes, ability scores are essential in D&D, but giving every class a single forced preset of ability scores (plus a fixed per-level allocation of scores) would keep a newbie from putting all their points in intellect as a fighter (or taking a sub-optimal initial score roll at character creation). Considering there is no way to re-roll your ability scores even with zen, it becomes an early game core mechanic that can make or break your character.

The issue with class distribution is twofold - for one, your ability to tailor your character to your job is limited if they release classes as specialized roles in combat rather than paragon paths. I've argued in the past why providing a few classes with discrete role customization that isn't changeable on the fly like a WoW spec is the best way to make each character unique while keeping everyone capable of doing a useful job.

In Neverwinter, this is the principal reason there are so few guardian fighters - because they are pure tanks, and no matter what their paragon paths are they will still be a sword-and-shield fighter, they don't give off an epic appearance like the dps roles, cleric, or gwf. If there were a single fighter class, where one paragon path was great weapon fighting and the other was guardian tanking, the shortages of both classes would almost certainly be less severe, because more people would have rolled a class that had options in its role.

The reasons for the decision seem apparent - paragons are about a few powers, not class function in combat. The classes are defined by the weapons they use, and the various styles of using any given weapon set fall into a paragon path (I anticipate, at least, given that every class only has one paragon path at launch). But like I said, this means role customization is lacking and everyone is pigeonholed into a single function, and while the cost of leveling alts isn't nearly as bad as it is in other games, the lack of any heirloom-esque items to speed up a reroll (which, since the foundry nerf, means repeating the same content over again each time) isn't very fun for anyone.

But overall the early game is great. It is probably why the MMO is popular right now - you do get a good impression despite the blemishes. In part 2, I'll be harsher, have no fear.

2013/04/10

Software Rants 12: Cryptocurrency

So Bitcoin tanked today for a while after Mt. Gox got DDoSed. It came back, and is still sitting at the insane $200 valuation it has held for a few days. And it was $30 in January.

Long term, though, Bitcoin has one shortcoming that makes it both a pyramid scheme and invalid as a currency.

By restricting the monetary supply over time and decreasing the rate of coin creation, it not only means early adopters get a disproportionate concentration of the monetary base, it means that every 4 years, when the coin generation halves, the currency deflates more and more. After 2140, the supply of Bitcoin will only ever decrease over time, as wallets are lost and the coins contained in those wallets are rendered unusable forever.
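To put the schedule in concrete terms, here is a quick back-of-the-envelope calculation in Python (a rough sketch - the real consensus rules truncate rewards to whole satoshis, but the shape is the same): the block reward starts at 50 BTC and halves every 210,000 blocks, so the total supply converges on roughly 21 million coins and new issuance effectively ends.

    # Rough Bitcoin issuance schedule: the reward starts at 50 BTC and
    # halves every 210,000 blocks (roughly every 4 years).
    reward = 50.0
    blocks_per_era = 210000
    total = 0.0
    eras = 0
    while reward >= 1e-8:  # below one satoshi the reward is effectively zero
        total += reward * blocks_per_era
        reward /= 2
        eras += 1
    print("reward eras until issuance stops:", eras)      # 33 eras, ~130 years
    print("approximate total supply: %.0f BTC" % total)   # ~21,000,000

After that point there is no new supply at all, only attrition from lost keys.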

This is catastrophically bad for any currency. When the supply shrinks over time, it means that the currency naturally becomes scarcer, so its valuation will deflate no matter the economic climate it exists in. In the future timeline where Bitcoin takes over, the holders of the majority of currency reserves (be they banks, exchanges, or super-wealthy citizenry) have no reason to ever spend money or invest, since their money gains value naturally. Of course, they would probably still invest - but if the risk climate now is awful when we have tremendous inflation on the horizon for the USD, it would take an absurdly safe and obvious investment to make a future bitcoin billionaire do anything with their money but sit on it, take it out of the money supply, and reap deflation.

It means those who have money gain value without doing anything. Just as having money right now means losing value because more and more money enters the system, in a deflationary economy where money leaves the system, the holders of the ever-scarcer resource need only sit on their reserves to gain real value. That is disastrous for economies if vast swathes of the monetary base are out of circulation being used as an investment.

That is what is happening right now with bitcoin, but this is a currency exchange speculative bubble - the inevitable flight from heavily rigged fiat currency means bitcoin is one of the few "safe" investments now. What should be reasonable investments - land, food, industry, metals, etc - are so heavily regulated and controlled they are invalid for that purpose, so artificially overvalued metals like gold and silver (a travesty of wasted good conductor metals that could see industrial use if they weren't sitting in vaults wasting space) become the flight targets.

But gold is saturated and rigged by massively wealthy market holders. Buying gold now is extremely foolish. Buying bitcoin now that its speculative bubble is in full swing is also foolish - its trend line pointed towards $50 by now, not $250, and it will eventually correct back down (and at such time I intend to throw a thousand or two into it, for the inevitable rise in valuation when the next major eCommerce site starts accepting it).

But it isn't a long term solution. One problem I see is that 256-bit SHA might be impossible to crack now, but I can't help but predict that in 50 years cracking it will be, while still a costly operation, not impossible - if I were trying to invent a cryptocurrency good enough to last centuries, I'd go with 512-bit SHA3 right now, although SHA3 wasn't finalized in 2008 when the original drafters of bitcoin were developing it.

I also imagine there are ways to make a cryptocurrency more resistant to a majority deception attack than bitcoin is - it doesn't even take 51% of the compute power in the swarm to be able to alter transactions; an attacker with around 30% stands a sizable chance of pulling off fraud against anyone who only waits a few confirmations.
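To put a number on that "sizable chance": the original Bitcoin paper works out the probability that an attacker controlling a fraction q of the hash power catches up and replaces a transaction the recipient has already accepted after z confirmations. Here is a small Python sketch of that calculation (the formula is from the whitepaper's attacker analysis; the particular q and z values are just illustrative):

    from math import exp, factorial

    def attacker_success(q, z):
        # Probability an attacker with hash power share q rewrites a
        # transaction buried under z confirmations (Bitcoin whitepaper, sec. 11).
        p = 1.0 - q
        if q >= p:
            return 1.0  # a true majority always wins eventually
        lam = z * (q / p)
        no_catch_up = 0.0
        for k in range(z + 1):
            poisson = lam ** k * exp(-lam) / factorial(k)
            no_catch_up += poisson * (1.0 - (q / p) ** (z - k))
        return 1.0 - no_catch_up

    # With 30% of the network and a merchant waiting 6 confirmations,
    # the attacker still succeeds roughly 13% of the time.
    print(attacker_success(0.3, 6))

Running it for larger z shows you need a couple dozen confirmations before a 30% attacker's odds drop below a tenth of a percent, which is a long time to wait for a payment to become final.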

Bitcoin is a great first try. In the long run, the best money is mathematics - distilling a secure currency is not an impossible problem, just a complex one. But Bitcoin's weakness is not its security, it is its deflationary future, and any potential cryptocurrency worth its salt will need a fixed, constant increase (as a % of total money, I would say) in its monetary base, to offset lost wallets and keep the money losing value naturally over time, so as to keep the currency acting as a means of exchange and not a means of investment.

2013/03/27

An Exploration of Human Potential 4: The Next Generation of Investment

There was a Vsauce video that asked a question about the potential of Kickstarter to replace Hollywood, and it touched on ways that Hollywood might try to exploit it. So I figured I should write what I think the inevitable conclusion of the crowdfunding "revolution" is, how and why it happened, and what comes next.

Kickstarter, Indiegogo, and their kin all center around the idea of consumers paying for the creation of things they want, often with "perks" for donating certain amounts, often (but not always) including a copy of the thing made. Small donations for things like video games often don't get you a copy of the game, which to me seems very backwards, but I'll get to that.

The major issues with crowdfunding are twofold. One, there is no investor protection clause - if a project reaches its funding goals, you are out the money and hoping they make whatever you are banking on. There is nothing inherently bad about this - it just means you are dumb to invest in people you don't have a reasonable expectation will produce the product you want, and if they turn around and run off with your money it's your fault for making a risky investment.

The problem with no protections is that unproven and untested developers / producers see magnitudes less interest and contribution than already entrenched groups that have delivered in the past - which is reasonable given the lack of protection, because if you have to choose between something brand new from some random guy trying to launch a project out of their garage, or an industry expert trying to create a sequel to some IP they own, you are obviously going with the latter, because you can reasonably assume they will actually make the product. But I'll get to why this is catastrophically bad in a bit.

The second major issue is that you are conflating the real role of crowdfunders - people acting as investors in the creation of new products, ideas, or initiatives - with donation-tier gifts that are supposed to appease them for parting with their money. It is an unnecessary indirection, but it is in many ways symptomatic of the first point - since you have no guarantees your project will actually happen, much more realistic "prizes" work to abate the issue and appease the masses.

The problem is this isn't a macroscopic solution to what I would argue is a systemic issue of the 21st century: automation and globalization will see the death of labor markets for physical unskilled work and an increase in the number of people who don't need to work mindless jobs. This means more people can, and should be driven to, enter creative ventures, and anything less than crowdfunding paired with pervasive information ubiquity is disingenuous of society and all the technological innovation made up to this point.

The end goal needs to be that consumers with money have a means of finding people offering to pursue and create new information, head new initiatives, and craft new products for those interested in them, and the ability to directly invest in what you want to see made. For one, it is the only ethical solution to the tyranny of IP and the resolution of broken property rights, and second, it is the best way to resolve the current economic spiral into extreme inequality between the wealthy investor and the paycheck-to-paycheck laborer.

Back to my first issue: the reason that having no protections is bad is that entrenched market forces get a disproportionate share of investment because they have proven track records. Their histories make them less risky to invest in, which drives people to put their money into whatever feels least risky, to see the things they want made with the least chance of losing their investment - and that means entering a market requires an excess of work, often to produce something of similar caliber without the crowdfunding that is supposed to enable small ventures in the first place.

The only solution is to abate the risk aspect. Any venture that operates in this new dichotomy will need to rigorously calculate its expenses and objectives and produce realistic goals, so that people can invest in it without fear of the venture taking the money and running. It would require some legal enforcement, maybe via contract with the exchange operator (aka, the kickstarter.com in this scenario), that any venture proposing a project needs to actually make what it says or else face a lawsuit for fraudulent business practices and monetary extortion. The legal system is a giant mess as well, but that is tangential - in a much more functional legal system, the provider of the exchange service would prosecute any project that fraudulently takes the money and doesn't deliver a product, in its "investors'" interests, entirely to mitigate the risk of investment in unproven ventures.

Of course, that means anyone going into a crowdfunded project needs to fear being sued for not delivering. That is good. That means they have to be realistic, and that the project's investors can reasonably expect to get what they pay for.

The resolution to the IP atrocities comes in the form of these funded projects being the means to provide the wellbeing and livelihoods of the creators for the duration of the venture - when they produce the product, they give a windowed release target, and meet it - otherwise they are liable for fraud. They propose how much they would need to live off for the duration of the venture, plus additional expenses, and it is up to the crowdfunders to determine if they are a worthy investment. If they reach their target monetary goal, they get the money and are contractually obliged to deliver the product in the time frame specified.

This also means the donation goals are unnecessary and detract from the purpose - paying content creators (or any idea creator) for creating such ideas. The information-based results of their labor (the blueprints for a robot they build, the movie they make, the program they write) should be released in as close to the public domain, or at the least under an open attribution license, as possible, as part of the contract. Once the product is funded by people who want to put their money where their mouth is to see it made, it should be freely available like the information it is. If you do crowdfund the creation of scarce resources - say, building a cold fusion reactor for a billion dollars - the schematics and blueprints better be public domain, but the reactor itself is obviously owned by the original venture to sell electricity as they wish, because it is finite tangible property with scarcity, and you can't just steal it. Of course, it is up to the contract between the investors and the venture - if they want to build a fusion reactor and give it to the local government, that is entirely in their contractual rights, they just need to state that plainly to their investors.

The reason this matters so much is that investment right now is a rigged, closed game for the economic elite. The stock market isn't an accurate investment scheme - many companies on the stock markets couldn't care less what their shares trade at, because they already did an IPO and got their cash reserves. After that, the trading rate never impacts them unless they release more shares into the market. People exchanging company ownership with other people doesn't impact the company at all unless the shareholders collectively take advantage of 51%+ ownership. Dividend payouts are unrelated to the trading value of a stock; they depend on profit margins. So in practice, the only way to actually invest in new ideas is to be in a closed circle of wealthy investors surrounding an industry, who play the chessboard of ideas to their advantage with behind-the-scenes agreements and ventures the public can't engage in - be they agreements between friends to try something new, or a wealthy person just taking a million dollars and trying something for a profit privately. Those aren't open investments in what people want.

This becomes more important when we consider we are losing dramatic amounts of middle class power with the decline in income and savings - people don't have the spending power or push to drive the markets anymore because of gross inequality, and the best way to fix that is to open people to supporting one another without their corporate overlord interests controlling what they can buy or enjoy. Moving the engine of creativity and investment back into the hands of the masses means people see what they collectively want made, made, and those that traditionally pull the economic strings lose considerable power if the people paying for the content creators are the people themselves, rather than money hungry investment firms and publishers.

It is absolutely an end game - the collective funding of idea creation is the endgame of every information-based industry, but to get there will require considerable cultural, legal, and economic shifts away from the poisonous miasma we exist in today. I hope it can happen in my lifetime; the degree of information freedom we could see in such a world would be wonderful to behold.

2013/03/22

Software Rants 11: Build Systems

So in my forays into the world of KDE, I quickly came upon the fiasco surrounding KDE4, where the software compilation transitioned its build system from autotools to CMake (in the end, at least). They were originally targeting SCons, thought it was too slow, tried to spin their own solution (bad idea) and ended up with Waf, gave up on that and settled on CMake.

First things first: any application that uses some kind of script language (and make / cmake / qmake files absolutely qualify) with its own syntax that doesn't derive from any actual scripting language is inherently stupid. If you encounter a situation where you can say "a domain language is so much more efficient in terms of programmer productivity here than any script syntax", you are probably tackling the problem wrong. Now, 10 years ago, you would have had a legitimate argument - no script language was mature enough to provide a terse syntax that delivered on brevity and ease of use at the time. Now Python, Ruby, and arguably Javascript are all contenders for mainstream script languages with near-human-readable syntax and rampant extensibility.

So if you see a problem and try to apply a domain language to it, I'm calling you out right there as doing it wrong. The mental overhead of having to juggle dozens of domain languages is never worth avoiding a dependency on a script language interpreter or JIT (like Python).

And that brings me to build processes, which are one of the most obvious candidates for scripting, but are almost always domain languages, from Ant and Maven to Make and Autotools and the gamut in between. Only SCons is reasonable to me, because it is a build environment in Python where you write your build scripts in Python.
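For what it's worth, a minimal SConstruct shows what that buys you - it is just an ordinary Python file that SCons executes, so loops, conditionals, and functions are all available for free (the source layout and target name here are made-up placeholders):

    # SConstruct - an ordinary Python file executed by SCons.
    # Environment() and Program() are provided by SCons in this file's namespace.
    import os

    env = Environment(CCFLAGS=['-O2', '-Wall'])

    # Because the build description is real Python, build logic is just code:
    sources = [os.path.join('src', f)
               for f in os.listdir('src') if f.endswith('.cpp')]

    env.Program(target='myapp', source=sources)

No new syntax to learn, and anything the build needs beyond what SCons ships with is one import away.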

Now that is valuable. SCons is on track, I hope, to not only merge back with Waf, but to solve its performance hindrances and deliver a modern build system divergent from the alien syntax of make, for a modern use case where external dependencies, caching, and deployment all need to be accounted for.

However, I am stuck looking at cmake for any new project I want to work with, solely due to qtcreator and kdevelop integration. And honestly, if it stays out of my way, I will put up with it. I want to see Scons succeed though, so like the other hundred projects I want to get involved with, I want to see Scons integration in IDEs. I also want to see it solve its performance problems and deliver a solid product.

One thing I wonder is why they didn't keep the build files in Python but write the backend routines in C or C++, so they could interface with the Python interpreter through some scons.so library.

I definitely think any software you write you intend to be run across many machines should be native. Anything less is a disservice to your clientele, in microseconds of time wasted or in measurable electrical consumption of script interpreters in use cases they don't fit.

A build description for a project? Absolutely script-worthy. The backend to process build scripts? Should be native. The project is a business app with low deployment? Python that sucka. The project is a consumer app? Probably native.

I used to think it would make more sense to write everything in Python and then delve into C++ where performance is needed, but the promises of cross platform porting of qml and qt apps is just too good to pass up.

But yeah, build systems are a fucking mess, and as I continue to write up my Magma manifesto, one of the core tenets is not only compiler-level support for Stratos scripts, but the usage of Stratos as the build system from the get go. The modularization instead of textualization of files makes bundle finding and importing a breeze, and the usage of compiled libraries or even whole software packages is just a step away.

2013/03/14

Software Rants 10: The Crossroads of Toolkits, circa 2013

I'm making a prediction - in 5 years, the vast majority of application based software development will be done in one of two environments - html5, or qt.

Those sound radically different, right? What kind of stupid tangent am I going off on now? Well, the biggest thing happening in the application world is a transition - the world is slowly getting off the Windows monoculture thanks to mobile. Every new device OS, from BlackBerry to iOS to Android to the upcoming Ubuntu Phone and Sailfish, plus the growing GNU/Linux gaming scene and OSX adoption, makes porting the most costly part of development - so the biggest deal is finding a development environment where you can write once, deploy everywhere.

And the only two runners in that race right now are qt and html5. I'd mention Mono and Xamarin, but the C# runtime is so slow and huge on mobile platforms the performance isn't there, and the momentum is moving towards qt anyway. Here are the respective pros and cons:

Qt
  • Optional native performance if written wholly in C++.
  • More practically, write applications in qml (for layout and styling) and javascript, stick any performance critical parts in C++, and since signals and slots make the transition seamless, take advantage of rapid development and high performance.
  • LGPL apps distribute the qt libraries through a bundled download assistant that will retrieve them once for all local qt apps, so they aren't redundantly cloned. Downside is that with low adoption the downloader is a hindrance for users.
  • Integrates nicely into most toolkits' appearances. For example, it uses the options button on Android, and supports gestures.
  • As native apps, qt apps are local, offline capable, and are exposed to the file system and all other niceties of a first class citizen program.
Html
  • Most pervasive platform, but not concrete. qt5 is stable and shippable, and because Digia controls it you can expect forward updates to go through without a hitch. Banking on html5 webapps means you aren't as well supported on unupdated devices (not that much of a problem in, say, 2 years), but older devices (a tail that grows longer as consumer compute power plateaus) mean fewer browser updates, and the need for feature-detection tag soup to figure out what you actually have available.
  • Not solid. At all. Webaudio is still sorely lacking, webrtc isn't finalized, webgl is still experimental, and input handling is nonexistent. Local storage is too small to cache any significant amount of an app, especially for offline usage.
  • By being web based, you have inherent access to all the resources of the Internet, whereas qt requires web APIs to access the same things.
  • Inherently cloud based, explicit cloud configuration required for qt.
  • Qt generates installable apps for its target platforms as native local applications. html5 apps are cloud based, so getting into one is only as slow as the page load to access it. So a lower barrier to entry.
So where is all this crazy "one or the other" mindset coming from? It is becoming increasingly silly and infeasible to use native widget toolkits and languages for every platform you target - what could be boxed up in one html5 or qt app with two skins (one mobile and one desktop) with shared logic and build infrastructure, debugging, and testing, in three languages (qml/js/c++ vs html/js/css) would require, in native form targeting every platform:
  • Objective C + Cocoa for ios
  • Objective C + Quartz for OSX
  • Windows Forms + C#/C++ + win32 for Windows
  • WinRT + C++ for Windows Phone
  • GTK (or qt) + c (or C++) for Linux
  • Java + ADK for Android
  • Qt for Blackberry, Ubuntu Phone, Sailfish (anyway).
With the exception of Windows Phone (which won't succeed, and will bomb pretty bad anyway, and qt could just get platform parity with it if it ever became popular) qt works everywhere, and is actually required on the newest mobile platforms anyway. Likewise, html5 apps will work everywhere as long as you are targeting IE10+, Firefox 16+, Chrome 20+, ios5, Android 4.0+, etc. Qt isn't as limited against backwards systems because it exists natively as a local app.

Nothing else comes close to the device parity of these two platforms. Any new application developer is naive not to use one of these, because all the others listed are dead ends with platform lock-in. The plethora of backers of the w3c and Digia come from all these platforms and have an interest in promoting their continued growth, and the platforms themselves realize that being device-transcendent makes them all the more useful.

What I find really interesting is that the interpreted languages, Java / C#, are nowhere. Mono is close to being device prolific, but Oracle is a sludge of an outdated bureaucratic death trap that has never realized an opportunity since it bought Sun, so they just let Java flounder into obscurity. Which is fine; the language grows at a molasses pace and makes me mad to even look at, with such critical flaws as no function objects and no default arguments.

But qt does it better, with C++ of all things. I guess GCC / Clang are useful in their architecture proliferation.

Which is one of the main reasons I'm focusing myself on qt, and will be doing my work in the next few months in it. I think it is the future, because at the end of the day, html is still a markup language. It has grown tumors of styling and scripting and has mutated over the years, but you are still browsing markup documents at the end of the day. I just like having access to a system to its core, and qt provides that option when necessary. So I'm betting on qt, and hope it pays off.

Software Rants 9: Capturing the Desktop

In my continuing thinking about next generation operating systems and the ilk, I wanted to outline the various aspects of a system necessary to truly win the world - all of the parts of a whole computing experience that, if presented and superior to all competitors, would probably change the world overnight. No piece can be missing, as can be said of Linux space and its lack of non-linear video editors, GIMP's subpar feature parity against competition, and audio's terrible architecture support. So here are the various categories and things a next generation desktop needs in order to capture the consumer space.

Core
  • Microkernel providing consistent device ABI and abstractions. Needs to be preemptive, have a fair low overhead scheduler, and be highly optimized in implementation. The kernel should provide a socket based IPC layer.
  • Driver architecture built around files, interfacing with kernel provided device hooks to control devices. Driver signing for security, but optional disabling for debugging and testing. Drivers need an explicit debug test harness since they are one of the most important components to minimize bugs in.
  • Init daemon that supports arbitrary payloads, service status logging, catastrophic error recovery, and controlled system failure. The init daemon should initialize the IPC layer for parallel service initialization (think systemd or launchd).
Internals
  • Command shell using an elegant shell script (see: Stratos in shell context). Most applications need to provide CLI implementations to support server functionality.
  • Executor that will checksum and sign check binary payloads, has an intelligent fast library search and inject implementation, and supports debugger / profiler injection without any runtime overhead of standard apps.
  • Hardware side, two interface specifications - serial and parallel digital. Channels are modulated for bandwidth, and dynamic parallel channels allow for point to point bandwidth control on the proverbial northbridge. High privileged devices should use PDI, and low privileged should use SDI. Latency tradeoffs for bandwidth should be modulation specific, so one interface each should be able to transition cleanly from low latency low bandwidth to high latency pipelined bandwidth. Consider a single interface where a single channel parallel is treated as a less privileged interface. Disregard integrated analog interfaces. USB can certainly be implemented as an expansion card.
  • Consider 4 form factors of device profile - mobile, consumer, professional, and server. Each has different UX and thus size / allocation of buses requirements, so target appropriately. Consumer should be at most mini-ITX scale, professional should be at most micro-ATX - we are in the future, we don't need big boards.
Languages
  • Next generation low level systems language, that is meant to utilize every programming paradigm and supply the ability to inline script or ASM code (aka, Magma). Module based, optimized for compiler architecture.
  • A common intermediary bytecode standard to compile both low and middle level languages against, akin to LLVM bytecode. Should support external functionality hooks, like a GC or runtime sandbox. This bytecode should also be signable, checksum-able, and interchangeable over a network pipe (but deterministic execution of bytecode built for a target architecture in a systems programming context is not guaranteed).
  • Middle level garbage collected modularized contextual language for application development. Objectives are to promote device agnosticism, streamline library functionality, while providing development infrastructure to support very large group development, but can also be compiled and used as a binary script language. See : Fissure.
  • High level script language able to tightly integrate into Magma and Fissure. Functions as the shell language, and as a textual script language for plugins and framework scripting on other applications. Meant to be very python-esque, in being a dynamic, unthreaded simple execution environment that promotes programmer productivity and readability at the cost of efficiency (see : Stratos).
  • Source control provided by the system database backend, and source control is pervasive on every folder and file in the system unless explicitly removed. Subvolumes can be declared for treatment like classic source control repositories. This also acts as system restore and if the database is configured redundant acts as backup.
Storage
  • Copy on write, online compressing transparent filesystem with branch caching, auto defragmentation, with distributed metadata, RAID support, and cross volume partitioning. Target ZFS level security and data integrity.
  • Everything-as-a-file transparent filesystem - devices, services, network locations, processes, memory, etc as filesystem data structures. Per-application (and thus per-user) filesystem view scopes. See the next gen FS layout specification for more information.
  • Hardware side, target solid state storage with an everything-as-cache storage policy - provide metrics to integrate arbitrary cache layers into the system caching daemon, use learning readahead to predict usage, and use the tried and true dumb space local and time local caching policy.

Networking
  • Backwards compatibility with the ipv6 network transport layer, TCP/IP/UDP, TLS security, with full stack support for html / css / ecmascript compliant documents over them.
  • Rich markup document format with WYSIWYG editor support, scripting, and styling. Meant to work in parallel with a traditional TCP stack.
  • Next generation distributed network without centralization support, with point to point connectivity and neighborhood acknowledgement. Meant to act as a LAN protocol for simple file transfer and service publication (displays, video, audio, printers, inputs, up to software like video libraries, databases, etc), and to be deployable wideband as a public Internet without centralization.
  • Discard support for ipv4, ftp, nfs, smb, vnc, etc protocols in favor of modern solution.
Video
  • Only a 3d rendering API where 2d is a reduced set case. All hardware is expected to be heterogeneous SIMD and complex processing, so this API is published on every device. Since Magma has SIMD instruction support, this API uses Magma in the simd context instead of something arbitrary like GLSL. It is a standard library feature of the low-level language.
  • Hardware graphics drivers only need to support the rendering API in their device implementation, and the executor will allocate instructions against it. No special OS specific hooks necessary. Even better, one standard linkable library may be provided that backends onto present gpu hardware or falls back to pipelined core usage.
  • No need for a display server / service, since all applications work through a single rendering API. A desktop environment is just like any 3d application running in a virtual window, it just runs at the service level and can thus take control of a display (in terms of access privileges, user applications can't ever take control of a display, and the best they can do is negotiate with the environment to run in a chromeless fullscreen window).
  • Complete non-linear video editor and splicer that is on par with Vegas.
  • Complete 3d modeler / animator / scene propagator supporting dae, cad, and system formats.
  • System wide hardware video rendering backend library supporting legacy formats and system provided ones, found in Magma's std.
  • Complete 2d vector and raster image composer, better UX and feature parity than Gimp, at least on par with photoshop. Think Inkscape + sai.
  • 3d (and by extension, fallback 2d) ORM game engine implemented in Magma provided as a service for game makers. Should also have a complete SDK for development, use models developed in our modeler.
  • Cloud video publishing service baked into a complete content creation platform.
  • Art publishing service akin to DA on the content creation platform.
  • Saves use version control and continuous saving through DB caching to keep persistent save clones.
Audio
  • Like Video, a single 3d audio API devices need to support at the driver level (which means positional and point to point audio support). Standards should be a highly optimized variable bitrate container format.
  • Software only mixing and equalizing, supplied by the OS primary audio service, and controllable by the user. Each user would have a profile, like they would have a video profile.
  • Audio mixing software of at least the quality of Audacity and with much better UX.
  • Audio production suite much better than garageband.
  • System wide audio backend (provided in Magma's std) that supports legacy and system formats.
  • Audio publishing service akin to bandcamp in a content creation platform.

Textual
  • Systemic backend database assumed present in some object mapping API specified in Magma. Different runlevels have access to different table groups, and access privilege applies to the database server. This way, all applications can use a centralized database-in-filesystem repository rather than running their own. Note: database shards and tables are stored app-local rather than in a behemoth registry-style layout, and are loaded on demand rather than as one giant backend. The database server just manages the storage independently. The database files use the standard serialization format, so users can write custom configurations easily. These files, of course, can be encrypted.
  • Since the database is inherently scriptable, you can store spreadsheets in it. It can also act as a version control repository, so all documents are version controlled. 
  • Singular document standard, supporting scripting and styling, used as local WYSIWYG based binary or textual saved documents, or as "web" pages.
  • Integrated development environment with gui source control hooks, support for the system debugger and profiler, consoles, collaborative editing, live previews, designer hooks, etc. Should be written in Magma and load features on demand. Target qtcreator, not visual studio / eclipse.
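To make the object mapping idea concrete, here is a rough C sketch of what the application-facing surface could look like. None of these names exist anywhere; they are placeholders for whatever the Magma std ends up specifying:

    /* Hypothetical object-mapping surface for the system database service.
       Tables are stored app-local and loaded on demand; access privileges are
       enforced per runlevel by the database server, not by the application. */
    #include <stddef.h>

    typedef struct db_handle db_handle;  /* connection to this app's shard   */
    typedef struct db_table  db_table;   /* one lazily loaded table          */
    typedef struct db_row    db_row;     /* cursor over query results        */

    /* open the calling application's shard; the server checks its runlevel */
    int  db_open_local(const char *app_id, db_handle **out);

    /* tables are created on first use from a declared schema string */
    int  db_table_open(db_handle *db, const char *name,
                       const char *schema, db_table **out);

    /* every write lands as a new revision, so documents stay version controlled */
    int  db_insert(db_table *t, const char *fields);

    /* iterate matching rows; returns nonzero when the cursor is exhausted */
    int  db_next(db_table *t, const char *where, db_row **out);

    void db_close(db_handle *db);
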
Security
  • Pervasive, executable based mandatory access control. Profiles are file based, scripted in the standard serialization format, and should be simple to modify and configure with admin privileges. A sketch of how a parsed profile might look follows this list.
  • Contextual file system views as a part of MAC: an application can only "see" what it is allowed to see, in a restricted context.
  • Binary signing pervasively, keys stored in central database.
  • Folder, file, and drive based encryption. An encrypted system partition can be unlocked by a RAMFS boot behavior.
  • Device layer passwords are supported as encryption keys. The disk is encrypted with the password as the key, instead of the traditional behavior where the password gate is independent of the data and you can just read the contents off a password protected disk.
  • Network security by default - the firewall has a deny policy, as do system services. Fail2ban is included with reasonable base parameters that can be modified system wide or per service. All network connections on the system protocol negotiate secure connections and use a hosted key repository with the name server for credentials exchange and validation.
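As a rough illustration of the executable-based MAC and contextual file system view ideas above, here is a hedged C sketch of what a parsed profile might look like in memory. The structure and field names are made up for this post, not part of any spec:

    /* Hypothetical in-memory shape of a per-executable MAC profile after it
       has been parsed from its serialized profile file. */
    #include <stdbool.h>
    #include <stddef.h>
    #include <string.h>

    typedef struct mac_rule {
        const char *path_prefix;  /* subtree this rule applies to             */
        bool        can_read;
        bool        can_write;
        bool        visible;      /* contextual view: hidden paths simply do
                                     not exist as far as this binary can tell */
    } mac_rule;

    typedef struct mac_profile {
        const char     *binary_hash;  /* ties the profile to a signed binary  */
        const mac_rule *rules;
        size_t          n_rules;
    } mac_profile;

    /* first matching prefix wins; anything unmatched is denied and invisible */
    static bool mac_path_visible(const mac_profile *p, const char *path) {
        for (size_t i = 0; i < p->n_rules; i++) {
            size_t len = strlen(p->rules[i].path_prefix);
            if (strncmp(p->rules[i].path_prefix, path, len) == 0)
                return p->rules[i].visible;
        }
        return false;
    }
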
Input
  • Need to support arbitrary key layouts with arbitrary glyphic key symbol correlations - think utf8 key codes. Also vector based dimensional movement, which can be implemented as touch, mouse, rotation, joysticks, etc. So the two input standards are motion and key codes; a sketch of both event shapes follows this list.
  • Input devices provided as files in the FS (duh) and simple input APIs provided in Magma.
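A minimal C sketch of the two event shapes, assuming the key codes really are Unicode codepoints and motion really is a plain vector; the struct names are invented here:

    /* Hypothetical encoding of the two input standards: key codes and motion.
       A touchpad, mouse, joystick, or gyro all reduce to the same motion event. */
    #include <stdint.h>

    typedef struct key_event {
        uint32_t codepoint;  /* the glyph this layout maps the key to         */
        uint8_t  pressed;    /* 1 = down, 0 = up                              */
    } key_event;

    typedef struct motion_event {
        float   dx, dy, dz;  /* displacement or rotation vector, device units */
        uint8_t kind;        /* translation vs rotation, per device class     */
    } motion_event;

Since devices show up as files, reading input would be nothing more than a read() on the device node into one of these records.
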
If you can make the experience of content and software creators sufficiently extravagant, you can capture markets. We live in an era of constant global communication, such an OS needs to take full advantage at every level of pervasive communication, including network caching. Since the language stack is designed to provide a vertical contextual development paradigm, almost all resources are implemented in Magma libraries with bindings everywhere else up the stack as appropriate. Since most devices, services, etc are provided as files, the library implementations can be simple and platform agnostic given file provisioning.

2013/03/06

Reddit Rants 2: Mir Fallout Ranting

This time, in the wake of Mir's unveiling as the new Ubuntu display server, I was retorting someone who said fragmentation isn't a problem here and that the competition Mir would produce would be positive and get Wayland developed faster. Here is my retort:

The correct way to go about FOSS development is:

Explore Options -> engage with open contribution projects in the same space -> attempt to contribute to the already established product, improving it into what you need, given community support -> if that doesn't happen, consider forking -> if forking is not good enough and you need to rebase, start from scratch.

Canonical skipped to the last step. It is fine if you have no other option but to fragment because then you are representing some market segment whose needs are not met.

The need for a next generation display server that can run on any device with minimal overhead, sane input handling, and network abstraction is already met - it exists in a stable API state with running examples, and it is called Wayland.

The problem with Mir and Canonical is that, unlike community projects built on community engagement, Canonical doesn't give a crap about what the community thinks. They maintain Upstart because fuck you, they created bazaar in an era of git because fuck you, they maintain a pointless compositor named Compiz because fuck you, they invented a UI you could easily recreate in Plasma or Xfce or even Mate with slight modification but they did it from scratch and introduced a standardless mess of an application controls API because fuck you.

They want to control the whole stack, not play ball. They got way too much momentum from the Linux community in an era when Fedora was still mediocre, Arch didn't exist (and is still too user unfriendly), Debian was still slow as hell, opensuse was barely beginning, and the FOSS ecosystem wanted to rally around a big player in the consumer space the way redhat was in the server space.

Mir is bad because it will persist regardless of its merit. Canonical would never give up and deprecate it, the same way they are still trying to advertise an Ubuntu TV 2 years later with no working demo. Mir will now steal any momentum X / Wayland have towards reasonable graphics driver support, and possibly steal the gpu manufacturers' support entirely away from what is shaping up to be the much more technically promising and openly developed project in the form of Wayland.

SurfaceFlinger is a great comparison. Mir will be just like that. It will eat up hardware support into a backend that can't easily mesh with modern display servers, and hardware manufacturers don't support multiple display servers. So if Mir crashes and burns, interest in Linux wanes because it looks like the "same old fragmented unstable OS", and if it doesn't, it's completely detached from the FOSS community anyway under the CLA and Canonical will control it entirely to their will.

It isn't a question of communal merit. Canonical doesn't play that way. That is why this is so bad. It is fine if the top level programs are fragmented and disparate, because that presents workflow choice. The OS display server, audio backend, network stack, init daemon are not traditionally user experience altering, they are developer altering. If you want developers, you point them to one technically potent stack of tools well implemented by smart people with collective support behind them so they can make cool things and expect them to run on the OS. That isn't the case when you have 3 display servers, 3 audio backends, 3 init daemons, 500 package formats, etc.

I also wrote a shorter response on a HN thread:

I'm personally not too worried here. The thing is both Wayland and Mir will be able to run X on top of them, so currently all available GUI programs will still work.

What matters is the "winner". They will both hit mainstream usage, we will see which one is easier to develop for, and that one will take off. If Mir's claims of fixing input / specialization issues in Wayland come to fruition, then it will probably win. If Mir hits like Unity, or atrophies like Upstart, then Wayland will probably win.

The problem is that if Wayland fails everyone can switch to Mir, but if Mir proves weaker, we are stuck with a more fragmented desktop space because Canonical doesn't change their minds on these things.

I also played prophet a bit on phoronix (moronix?) about how this will pan out:

There are only 3 real ways this will end.

1. Canonical, for pretty much the first time ever, produces original complex software that works on time, and does its job well enough to hit all versions of Ubuntu in a working state (aka, not Unity in 11.04). By nature of being a corporate entity pushing adoption, and in collusion with Valve + GPU vendors, Mir sees adoption in the steambox space (in a year) and gets driver support from Nvidia / ATI / Qualcomm / etc. Mir wins, regardless of technical merit, by just having the support infrastructure coalescing around it. Desktop Linux suffers as Canonical directs Mir to their needs and wants, closes development under the CLA, and stifles innovation in the display server space even worse than the stagnation of X for a decade caused.

2. Mir turns out like most Canonical projects: fluff, delay, and unimpressive results. The consequence is that Ubuntu as a platform suffers, and mainstream adoption of GNU/Linux is once again kicked back a few pegs, since distributors like system76 / Dell / Hp can't realistically be selling Ubuntu laptops with a defective display server and protocol, but nobody else has been pushing hard on hardware sold to consumers with any other distro (openSuse or Fedora seem like the runner up viable candidates, though). Valve probably withdraws some gaming support because of the whole fiasco, and gpu drivers don't improve at all because Mir flops and Wayland doesn't get the industry visibility it needs, and its potential is thrown into question by business since Canonical so eagerly ignored it. The result is we are practically stuck with X for an extended period of time, since nobody is migrating to Wayland because Mir took all the momentum out of the push to drop X.

3. The best outcome is that Mir crashes and burns, Wayland is perfect by year's end and can be shipping in mainstream distros, and someone at Free Desktop / Red Hat gets enough inroads with AMD / Nvidia to get them to either focus entirely on the open source drivers to support Wayland (best case) or refactor their proprietary ones to work well on Wayland (and better than they do right now on X). The pressure from desktop graphics and the portability of Wayland, given Nvidia supporting it on Tegra as well, might pressure hard line ARM gpu vendors to also support Wayland. The open development and removal of the burden of X mean a new era of Linux graphics, sunshine and rainbows. Ubuntu basically crashes and burns since toolkits and drivers don't support Mir well, or at all, and Canonical being the bullheaded business it is would never consider using the open standard (hic, systemd, git).

Sadly, the second one is the most likely.

My TLDR conclusion is that Mir is just a power grab by Canonical as they continue to rebuild the GNU stack under their CLA license and their control. I don't have a problem with them trying to do vertical integration of their own thing, but it hurts the Linux ecosystem that made them what they are today to fork off like this, and it will ruin the momentum of adoption the movement has right now, which makes me sad.

Reddit Rants 1: Plan 9

I'm going to start posting some of the rants I put on reddit that get some traction here as well, for posterity's sake. Because all the crap I've been saying is so important I should immortalize it in blag form forever!

This one was in a thread about plan9, got the most upvotes in the thread I think, and illustrates what it was and why it happened. Consider it a follow up to my attempts at running plan9 after all that research I did.

> I wiki'd Plan 9 but can someone give me a summary of why Plan 9 is important?

  1. It was a solution to a late 80s problem that came up, that never was solved because technology outpaced it. Back then, there were 2 kinds of computers - giant room sized behemoth servers to do any real work on, and workstations for terminals. And they barely talked. Plan 9, because of how the kernel is modularized, allows one user session to have its processors hosted on one machine, the memory on another, the hard disks some place else, the display completely independent of all those machines, and the input can come from somewhere else again, and the 9p protocol lets all those communications be done over a network line securely. So you could have dozens of terminals running off a server, or you could just (in a live session) load up the computation server to do something cpu intensive. The entire system, every part, was inherently distributed.
  2. It treated every transaction as a single protocol, so that networked and local device access would be done under 9p, and the real goal was to make it so that any resource anywhere could be mounted as a filesystem and retrieved as a file. It had files for every device in the system well beyond what Linux /dev provides, and it had almost no system calls because most of that work was done writing or reading from system files. About the only major ones were read, write, open, and close, whose behavior depended on the type of interaction taking place and could do radically different things (call functions in a device driver, mount and stream a file over a network, or read a normal file from a local volume). See the sketch after this list.
  3. File systems could be overlaid on one another and had namespaces, so that you could have two distinct device folders, merge them into one in the VFS, and treat /dev as one folder even if the actual contents are in multiple places. Likewise, each running program got its own view of the file system specific to its privileges and requirements, so access to devices like keyboards, mice, network devices, disks, etc could be restricted on a per application basis by specifying what it can or can not read or write to in the file system.
  4. This might sound strange, but graphics are more of a first class citizen in plan9 than they were in Unix. The display manager is a kernel driver itself, so unlike X it isn't userspace. The system wasn't designed to have a layer of teletypes under the graphics environment, they were discrete concepts.
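Here is a rough POSIX-flavored C sketch of what point 2 means in practice. The path and record format are illustrative rather than Plan 9's exact ones, and Plan 9's own C library differs from POSIX, but the shape of the interaction is the point: talking to a device is just open/read/write on a path, no dedicated syscall.

    /* Everything-is-a-file: device interaction through the same calls used
       for ordinary files. Illustrative only, not Plan 9's actual device format. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        char buf[64];
        int fd = open("/dev/mouse", O_RDONLY);      /* same call a disk file uses  */
        if (fd < 0) return 1;

        ssize_t n = read(fd, buf, sizeof buf - 1);  /* the driver answers the read */
        if (n > 0) {
            buf[n] = '\0';
            printf("raw device record: %s\n", buf);
        }
        close(fd);
        return 0;
    }
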
Plan9 was full of great concepts and ideas in modern OS design. The problem became that it was always a research OS, so nobody tried using it in production. The same reason Minix is pretty low ball - it didn't have a major market driver for adoption. The problem it solved best, distributed systems using heterogeneous components spread across a network, became less of a problem as compute power improved, and the growth of the internet allowed similar behaviors to be overlaid on top of complete systems. The overhead of running entire operating systems to utilize network resources has never since been high enough to justify taking some of the radical departures (good ones, I think) plan9 made.
Today, it is kind of old (it doesn't fully support ANSI C, for example, and doesn't use the standard layout of libraries), and while it is realistically possible that, if GCC and glibc were ported to plan9 fully, you could build a pretty complete stack out of already available FOSS Linux programs, the target audience of plan9 is developers who really like dealing with files rather than arbitrary system calls, communication protocols, signal clones, etc.


I'll argue some flaws of plan9 (I also posted a lot of positives earlier up this thread...) on a lower level:

1.  It doesn't support ansi C, and uses its own standard library layout for its C compiler.  Because the OS threw out the kitchen sink, porting the GNU coreutils, glibc, and GCC would take a ton of effort.  So nobody takes the initiative.

2.  9p is another case of the xkcd-esque competing standards mess.  Especially today - I would make the argument IP as a filesystem protocol would probably make the most sense in a "new" OS, because you can change local crap much more easily than you can change what the world uses for the internet.  *Especially* since ipv6 has the lower bytes as local addressing - you can easily partition that space into a nice collection of addressable PIDs and system services, and can still use loopback to access the file system (and if you take the plan9 approach with application level filesystem scopes, it's easy to get to the top of your personal vfs).  There is a rough sketch of that addressing idea after this list.

3.  It is *too old*.  Linux of today is nothing like Linux of 1995 for the most part.  Almost every line since Linux 2.0 has been rewritten at least once.  plan9, due to not having as large a developer community, has a stale codebase that has aged a lot.  The consequence is that it is still built with coaxial ports, vga, svideo, IDE, etc in mind rather than modern interfaces and devices like PCIE, SATA, hdmi, usb, etc.  While they successfully added these on top, a lot of the behaviors of the OS were a product of its time when dealing with devices, and it shows.  This is the main reason I feel you have an issue with the GUI and text editor - they were written in the early 90s and have hardly been updated since.  Compare rio to beOS, OS/2, Windows 95, or Mac OS 8.
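To show what I mean about partitioning the low bytes, here is a small C sketch that packs a service class and a PID into the bottom of an ipv6 address under a private prefix. The layout is entirely made up for illustration; nothing about it is a standard:

    /* Carve the low 64 bits of an ipv6 address into a (service, pid) pair so
       that local resources become directly addressable endpoints. */
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    static struct in6_addr local_resource_addr(uint16_t service, uint32_t pid) {
        struct in6_addr a;
        memset(&a, 0, sizeof a);
        a.s6_addr[0]  = 0xfd;                    /* private, locally assigned prefix */
        a.s6_addr[8]  = (uint8_t)(service >> 8); /* low 64 bits: our own layout      */
        a.s6_addr[9]  = (uint8_t)(service);
        a.s6_addr[12] = (uint8_t)(pid >> 24);
        a.s6_addr[13] = (uint8_t)(pid >> 16);
        a.s6_addr[14] = (uint8_t)(pid >> 8);
        a.s6_addr[15] = (uint8_t)(pid);
        return a;
    }

    int main(void) {
        char text[INET6_ADDRSTRLEN];
        struct in6_addr a = local_resource_addr(7 /* e.g. "vfs" */, (uint32_t)getpid());
        inet_ntop(AF_INET6, &a, text, sizeof text);
        printf("this process as a local endpoint: %s\n", text);
        return 0;
    }
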

A lot of the *ideas* (system resources provided as files, application VFS scopes, a unified protocol to access every resource) are *amazing* and I want them everywhere else.  The problem is that those don't show themselves off to the people who make decisions to back operating systems projects as much.

In closing (of the blog post, now) I still think I'd love to dedicate a few years to making a more modern computing platform than NT / Unix / whatever iOS is. I've illustrated my ideas elsewhere, and I will soon be posting a blog post linking to a more conceptualized language definition of that low-level language I was thinking of (I have a formal grammar for it; I'm just speccing out a standard library).