<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
     xmlns:atom="http://www.w3.org/2005/Atom"
     xmlns:dc="http://purl.org/dc/elements/1.1/"
     xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Skella &amp; Co — Writing</title>
    <link>https://skella.com.au/writing/</link>
    <atom:link href="https://skella.com.au/feed.xml" rel="self" type="application/rss+xml"/>
    <description>Essays by Jamie Skella on agentic systems, emerging technology, and the decisions leaders face next.</description>
    <language>en-AU</language>
    <copyright>© 2026 Skella &amp; Co</copyright>
    <managingEditor>hello@skella.com.au (Jamie Skella)</managingEditor>
    <webMaster>hello@skella.com.au (Jamie Skella)</webMaster>
    <lastBuildDate>Tue, 21 Apr 2026 00:00:00 +0930</lastBuildDate>
    <pubDate>Tue, 21 Apr 2026 00:00:00 +0930</pubDate>
    <ttl>1440</ttl>
    <image>
      <url>https://skella.com.au/skella-co-logo-310px.png</url>
      <title>Skella &amp; Co — Writing</title>
      <link>https://skella.com.au/writing/</link>
    </image>

    <item>
      <title>The agentic org chart is two people deep</title>
      <link>https://skella.com.au/writing/agentic-org-chart-two-people-deep/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/agentic-org-chart-two-people-deep/</guid>
      <pubDate>Tue, 21 Apr 2026 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Agentic systems</category>
      <description>Most companies are still drawing the wrong org chart for work that has already changed. What replaces it is smaller, faster, and unsettling to middle management for a reason.</description>
      <content:encoded><![CDATA[<p>Here's the thing about org charts: they're not just administrative artefacts. They're a theory of work. Every box and line is an assumption about who needs to tell whom what to do, and why. Draw a tall pyramid and you're saying that execution is expensive, specialised, and hard to coordinate without layers of people translating intent down the chain. That assumption has been true for most of human organisational history.</p>

      <p>It's ceasing to be true right now.</p>

      <p>I've spent the last couple of years working with companies building agentic systems, and I keep seeing the same thing. The org structures people are defending were designed for a world where humans performed the bulk of execution work. That world is changing faster than most leadership teams are willing to admit.</p>

      <h2>What the old shape assumed</h2>

      <p>The traditional hierarchy exists because coordination is hard. A CEO has a vision, but they can't personally talk to every engineer, every support rep, every analyst. So you add managers. Then managers of managers. Each layer's job is to take intent from above and translate it into specific tasks for the people below. Useful work. Genuinely useful, when the execution layer is made of people who need direction, context, and feedback to function.</p>

      <p>Middle management, in this model, is the translation layer. The person who turns a strategic priority into a sprint backlog. The team lead who turns a support philosophy into a ticket-handling process. The legal coordinator who turns a GC's instincts into a review checklist. Translation work. Real work.</p>

      <p>But here's what's shifted. AI agents translate intent into tasks themselves. You give a well-configured agent system a goal, you give it context and constraints, and it decomposes the problem, plans the steps, and executes. Not perfectly. With supervision required. But the supervision needed is much lighter than what it replaces, and it operates at a completely different altitude. You're not managing tasks. You're managing outcomes.</p>

      <h2>The shape that's emerging</h2>

      <p>I'd argue the natural shape of an AI-first organisation is two people deep.</p>

      <p>A principal decides what's worth doing. This is a genuinely senior role: setting direction, holding context, making judgement calls, owning accountability. Call it founder, executive, or lead strategist depending on your context. The title matters less than the function. This person has to be good.</p>

      <p>An operator runs the agent fleet. This person configures, monitors, and improves the systems that do the work. They catch errors. They know when to intervene. They understand what the agents can and can't handle. They're part systems thinker, part quality control, part infrastructure owner. This role is new and genuinely skilled.</p>

      <p>Between them? Not much. The translation layer shrinks because agents handle translation.</p>

      <h2>What this looks like in practice</h2>

      <p>Take a product team. Twelve people: a product manager, a few engineers, a designer, a researcher, a couple of QA specialists, a data analyst, a scrum master, and some support roles. Standard mid-size product squad. In an agentic setup, that's closer to three: a product lead who owns the roadmap and makes prioritisation calls, an operator who runs the agent systems handling spec writing, code generation, test coverage, and analytics, and maybe a designer doing the work that still genuinely benefits from a human eye. The other nine aren't fired in a weekend. But over a two-year horizon, you're not backfilling them when they leave.</p>

      <p>Or take a customer support team of 40. That team exists because each support interaction takes human time, and volume is high. With a well-built agent fleet, you might run the same volume with five people: an operator managing the systems, three specialists handling the genuinely complex cases the agents escalate, and a lead who owns the overall customer experience and makes calls on policy. The other 35 positions don't get posted again.</p>

      <p>Legal review is instructive. One general counsel with a well-configured agent system can cover a review workload that previously kept three or four junior lawyers busy. The GC supplies judgement: what's a real risk, what matters given the company's specific situation, when to push back on a clause. The agent handles the reading, summarisation, clause comparison, and first-pass flagging. The ratio of output per senior person goes up sharply.</p>

      <p>None of these examples require AGI. They're possible right now, with current systems, for any company willing to build them.</p>

      <h2>What the remaining jobs look like</h2>

      <p>The jobs that persist are senior jobs. That sounds like good news, and in some ways it is. The work that remains is more interesting: more judgement, more ambiguity, more real decisions. You're not approving PRs or triaging tickets. You're deciding what to build, who to serve, what risks to carry, when to stop.</p>

      <p>The operator role is genuinely new and underappreciated. These people need to understand how to write effective agent instructions, how to design evaluation systems that catch errors before they compound, how to debug agentic pipelines, and how to know when a workflow is trustworthy enough to run with less oversight. That's a real craft. We don't have great names for it yet, and we're certainly not training people for it in business schools.</p>
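
      <p>To make that craft a little less abstract, here is a minimal sketch of one piece of it: a regression-style evaluation that runs fixed prompts through a workflow and flags outputs that fail basic checks before errors compound. Everything here, from the function names to the checks, is hypothetical illustration rather than any particular product's API.</p>

```python
# Hypothetical sketch of an agent regression check: run a fixed set of
# test prompts through a workflow and flag outputs that fail basic rules
# before they reach production. Names and checks are illustrative only.

def run_agent(prompt: str) -> str:
    """Stand-in for a real agent call; returns a canned reply here."""
    return f"Summary: {prompt[:40]}"

TEST_CASES = [
    # (prompt, predicate the output must satisfy)
    ("Summarise Q3 revenue for the board", lambda out: out.startswith("Summary:")),
    ("List open legal review items",       lambda out: len(out) > 0),
]

def evaluate() -> list[str]:
    """Return the prompts whose outputs failed their checks."""
    failures = []
    for prompt, check in TEST_CASES:
        output = run_agent(prompt)
        if not check(output):
            failures.append(prompt)
    return failures

if __name__ == "__main__":
    failed = evaluate()
    print(f"{len(TEST_CASES) - len(failed)}/{len(TEST_CASES)} checks passed")
```

      <p>Real evaluation suites are far richer than this, but the shape is the same: encode what "acceptable" means, run it constantly, and treat failures as the signal to intervene.</p>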

      <p>The jobs that struggle to persist are coordination jobs. Roles that exist primarily to translate someone else's intent into someone else's action. Roles that exist to write status updates, run standups, own the ticket queue, or chase approvals. I don't say this with relish. A lot of good people have built careers in exactly these roles, and they'll need to reorient. The honest thing is to name it.</p>

      <h2>Why companies are slow to redraw the chart</h2>

      <p>There's an obvious political reason. The people who run companies often got there by being good at navigating large, complex organisations. They have personal networks built into the existing shape. Shrinking the hierarchy shrinks their domain, and in many cases, their identity.</p>

      <p>There's also a subtler reason. Rebuilding a team around agentic systems requires making explicit what was previously implicit. A good middle manager carried an enormous amount of tacit knowledge about how things get done, what the real priorities are, and which rules are actually enforced. When you remove that layer, you have to encode that knowledge somewhere. Usually in agent instructions and evaluation criteria. That work is unglamorous, takes time, and requires the principal to actually articulate things they've never had to write down before.</p>

      <p>Most organisations find it easier to add an AI tool to the existing structure than to restructure around it. So they do. And they get modest gains, and tell themselves they're keeping up.</p>

      <p>They're not.</p>

      <h2>The question worth sitting with</h2>

      <p>I've seen a few organisations move fast on this. They tend to share one quality: the people at the top understand what agents can actually do, not in the abstract but in detail, and they've built the willingness to act on what that implies.</p>

      <p>The companies that are stalling tend to share a different quality. Everyone can see what's happening. The senior people know that the translation layer is becoming automatable. The middle managers know their roles are changing. Nobody wants to be the one to say it out loud.</p>

      <p>The question isn't whether agentic systems will flatten your organisation. That's in motion regardless. The question is whether your existing team already knows it's happening, and is quietly hoping no one notices.</p>]]></content:encoded>
    </item>

    <item>
      <title>Strategy is what&#x27;s left when code and design become free</title>
      <link>https://skella.com.au/writing/strategy-when-code-and-design-become-free/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/strategy-when-code-and-design-become-free/</guid>
      <pubDate>Tue, 21 Apr 2026 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Design</category>
      <description>When the cost of making anything collapses, the value moves to knowing what&#x27;s worth making. That is the entire argument for strategy as a first-class discipline again.</description>
      <content:encoded><![CDATA[<p>For most of the last twenty years, the people who got things done ran the show. Engineers, designers, and delivery leads were the ones companies actually needed. Strategy was fine, useful even, but it was always downstream of execution. You could have the best ideas in the room and still lose to someone who shipped faster. The cost of making things kept strategy in its place.</p>

      <p>That constraint is collapsing.</p>

      <p>When a single operator working with AI agents can ship a feature in a day that used to take a squad a quarter, something structural changes. The scarcity moves. Making stops being the hard part, and the question of what to make becomes the thing that determines whether you win or waste a year of agent-hours on the wrong problem.</p>

      <p>I'd argue that's the entire case for strategy as a serious discipline again. Not strategy as a slide deck. Strategy as the actual rate-limiting work.</p>

      <h2>How execution became the measure of everything</h2>

      <p>It's worth being clear about why strategy fell out of fashion, because the reason matters.</p>

      <p>Shipping software was expensive. A product team of ten, running for six months, cost a lot of money and carried real risk. In that environment, velocity was precious. Every week spent debating priorities was a week not spent building. The teams that figured out how to move fast, cut scope, and ship iteratively were the ones that survived. Everyone else got disrupted or ran out of money.</p>

      <p>Execution culture was the rational response to expensive execution. Getting things done was hard enough that doing it well was a genuine competitive advantage. Companies that hired great engineers and great product managers and gave them room to move won. Companies that spent too much time in strategy retreats lost.</p>

      <p>So the field optimised. Agile methodologies, lean product thinking, two-pizza teams, sprint velocity as a health metric. All of it pointed in the same direction: keep moving, ship small, learn fast, repeat. Good advice. It worked.</p>

      <p>The mistake was assuming the underlying constraint would stay fixed.</p>

      <h2>What changes when making gets cheap</h2>

      <p>Here's a concrete version of what's happening. A year ago, building a new onboarding flow for a product might have taken a front-end engineer two weeks, a designer a week, a PM coordinating throughout, plus QA and a data analyst setting up tracking. Call it six weeks of combined effort across four people.</p>

      <p>Today, a capable operator with access to a good agentic coding system can do a version of that in a day or two. The interface gets built. The copy gets written. The tracking gets instrumented. Still needs review. Still needs human judgement at key points. But the effort-to-output ratio has shifted by an order of magnitude.</p>

      <p>Now consider the question that matters: which onboarding flow should you build? For which users? Solving for which behaviour? Measured how? That question takes the same amount of careful thinking it always did. Probably more, now that you can act on the answer so quickly. A bad answer is no longer corrected slowly through the natural friction of slow execution. A bad answer gets shipped fast, iterated on fast, and compounds fast.</p>

      <p>Speed amplifies everything, including mistakes. The premium on getting the question right goes up.</p>

      <h2>The skills that actually matter now</h2>

      <p>If strategy is the new bottleneck, it's worth being specific about what strategy means in practice. I'd split it into four things.</p>

      <p><strong>Foresight.</strong> The ability to read what's emerging before it's obvious. This is harder than it sounds. Most organisations are good at understanding the present and projecting it forward in a straight line. Genuine foresight means catching the signals that suggest the line is about to bend. Which customer complaints are early warnings of a larger shift? Which competitor move is actually a sign of where the whole market is heading? You can't automate this well. It requires wide reading, intellectual honesty, and a willingness to sit with ambiguity without resolving it prematurely.</p>

      <p><strong>Optionality.</strong> Staying light on commitments until you have enough information to commit well. This is a genuine skill that runs against most organisational instincts. Companies want plans. Boards want roadmaps. Teams want to know what they're building for the next two quarters. The strategist's job is to hold that pressure without over-committing too early, and to know which decisions are easily reversible and which ones aren't. Reversible decisions should be made fast. Irreversible ones deserve more time.</p>

      <p><strong>Judgement.</strong> Deciding what's worth optimising for. This is where most strategy falls apart. Companies confuse activity with direction. They measure what's measurable rather than what matters. They optimise for engagement when they should be optimising for trust, or for short-term conversion when they should be building for long-term retention. Good judgement means naming the thing you're actually trying to achieve, being honest about the trade-offs, and holding that line when the pressure to chase a different metric arrives.</p>

      <p><strong>Measurement design.</strong> Defining what "done" means clearly enough that an agent fleet can work toward it. This one's new, and it's more demanding than it sounds. Agents can optimise hard for whatever metric you give them. If your metric is wrong, they'll optimise hard for the wrong thing, quickly and at scale. The work of defining good success criteria, with the right leading indicators and the right guardrails, is now strategy-level work. It used to sit in analytics teams and get treated as operational. It should sit at the top.</p>
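
      <p>Written down, "success criteria with guardrails" can be as simple as a goal metric paired with limits an agent fleet must not breach while chasing it. A hedged sketch, with invented metric names and thresholds:</p>

```python
# Hypothetical sketch: a target metric paired with guardrails, so work that
# moves the metric while breaching a guardrail still counts as failure.
# All metric names and thresholds are illustrative, not a real system.

GOAL = {"metric": "weekly_retention", "target": 0.40}

GUARDRAILS = [
    {"metric": "support_ticket_rate", "max": 0.05},  # don't trade trust for growth
    {"metric": "unsubscribe_rate",    "max": 0.02},
]

def acceptable(measurements: dict[str, float]) -> bool:
    """True only if the goal is met AND no guardrail is breached."""
    if measurements[GOAL["metric"]] < GOAL["target"]:
        return False
    return all(measurements[g["metric"]] <= g["max"] for g in GUARDRAILS)

print(acceptable({"weekly_retention": 0.45,
                  "support_ticket_rate": 0.03,
                  "unsubscribe_rate": 0.01}))  # goal met, guardrails intact
print(acceptable({"weekly_retention": 0.50,
                  "support_ticket_rate": 0.09,
                  "unsubscribe_rate": 0.01}))  # metric up, trust down: fails
```

      <p>The design choice that matters is the conjunction: hitting the target while breaching a guardrail still counts as failure, which is exactly the distinction that stops a fleet from optimising the wrong thing at scale.</p>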

      <h2>What most companies are still doing</h2>

      <p>I've talked to a lot of product leaders over the past year. Most of them are genuinely excited about what AI systems can do. They're using them to accelerate their existing teams. They're shipping faster, which feels good.</p>

      <p>What's harder to see from inside is that shipping faster only helps if you're heading in the right direction. If your strategy is fuzzy, moving faster just gets you to the wrong place sooner. The discipline of knowing which direction to point hasn't improved at the same rate as the speed at which you can move.</p>

      <p>A few companies get this. Anthropic is unusually deliberate about which capabilities to build and which to hold back, and why. That's strategy work at the highest level, and it's what will likely determine their position over the next decade more than any individual product decision. Valve has famously maintained a structure where project selection is the real work, and it's sustained one of the highest revenue-per-employee ratios in the industry for years. These aren't accidents.</p>

      <p>The companies that struggle will be the ones that treat AI as a speed upgrade without asking what they're speeding toward.</p>

      <h2>Execution was the last war</h2>

      <p>I'm not saying execution stops mattering. Ships still need to leave the harbour, and someone has to make sure they do. But "we're great at shipping" is no longer a differentiated position. It's table stakes, the same way "we have electricity" stopped being a selling point once everyone had electricity.</p>

      <p>The question that separates good companies from great ones is shifting back to: what are you building, for whom, and why does it matter? That used to be a question you could afford to answer loosely because the execution constraints kept you honest. You couldn't build everything, so the bad ideas got filtered out by resource scarcity. Take away that constraint and the question gets sharper.</p>

      <p>Most companies are still talking about execution as the differentiator. You hear it constantly: we move fast, we ship, we iterate. That was the right answer for a long time. It was the last war, and the people who won it are proud, reasonably so.</p>

      <p>The next war is about knowing what to build in the first place. And most companies haven't started preparing for it yet.</p>]]></content:encoded>
    </item>

    <item>
      <title>An OpenClaw explanation your parents can understand</title>
      <link>https://skella.com.au/writing/an-openclaw-explanation-your-parents-can-understand/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/an-openclaw-explanation-your-parents-can-understand/</guid>
      <pubDate>Wed, 25 Mar 2026 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Essays</category>
      <description>&quot;OpenClaw is probably the single most important piece of software ever built&quot; - Jensen Huang (CEO of Nvidia).</description>
      <content:encoded><![CDATA[<blockquote><p>"OpenClaw is probably the single most important piece of software ever built" - Jensen Huang (CEO of Nvidia).</p></blockquote>

<p>But what is it, what does it represent, and what does it actually do?</p>

<p>Here's an explanation of OpenClaw your parents could understand...</p>

<h2>First, let's reframe "AI assistant"</h2>

<p>Forget everything you think you know about AI assistants. Not because the technology is magic, but because of what it's been given access to.</p>

<p>OpenClaw is software that runs quietly on my Mac mini at home. It connects an AI brain (in my case, Claude) to my messaging apps, my email, my files, my calendar, and the internet. It has the keys to my digital kingdom and is empowered to do things with those keys.</p>

<p>Some say it's what Apple Intelligence should have been.</p>

<h2>So what does it actually do?</h2>

<p>Let me give you some concrete examples. Below are just some of the simple things my assistant (I've named him George) does for me, out of the box.</p>

<p><strong>It remembers things from my messages that I've half-forgotten.</strong></p>

<p>"What did I discuss with Michael about the contract?" George then looks back through my numerous emails and messages, and tells me.</p>

<p><strong>It digests my daughter's school updates when they're published publicly.</strong></p>

<p>When the school posts updates to its website, George opens a browser, extracts the text, and sends me a digest.</p>

<p><strong>It continuously conducts deep research and feeds the structured results to me.</strong></p>

<p>I'm launching a new product and evaluating its market and competitors. I tell George what I need to know and he gets on with it, unprompted, and keeps me updated as he finds more.</p>

<p><strong>It can find any file I vaguely reference, across all my cloud storage and attach it when I need it.</strong></p>

<p>Dropbox, OneDrive, iCloud, even my Downloads folder - doesn't matter. I ask and George finds it.</p>

<p><strong>It snipes eBay auctions for me.</strong></p>

<p>I tell George: here's the listing, here's the most I'll pay. He watches the auction in the background. When there's about ten seconds left, he fires the bid. Never stare at a countdown timer again.</p>
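
<p>For the technically curious, the timing logic behind that kind of snipe is simple enough to sketch. The functions below are stand-ins, not OpenClaw's actual internals:</p>

```python
import time

# Hypothetical sketch of auction-snipe timing: wait until shortly before
# the listing closes, then place a single capped bid. get_auction_end and
# place_bid stand in for real browser automation.

def snipe(get_auction_end, place_bid, max_bid: float, lead_seconds: float = 10.0):
    """Sleep until lead_seconds before close, then fire one bid."""
    wait = get_auction_end() - time.time() - lead_seconds
    if wait > 0:
        time.sleep(wait)
    return place_bid(max_bid)

# Toy usage with stand-in functions (tiny timings so it runs instantly):
end_time = time.time() + 0.2                    # auction "closes" shortly
result = snipe(lambda: end_time,
               lambda amount: f"bid {amount} placed",
               max_bid=120.0,
               lead_seconds=0.1)
print(result)   # bid 120.0 placed
```

<p>The real version does this through browser automation rather than toy lambdas, but the timing arithmetic is the whole trick.</p>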

<p><strong>It recommends meal recipes, then organises delivery of the ingredients.</strong></p>

<p>When I like his ideas, George uses his web browser on supermarket websites to add all of the ingredients those recipes need to an online shopping cart, then orders the items using my account.</p>

<p>You probably get the gist - precisely what it can do is largely limited only by your imagination.</p>

<h2>Why this matters beyond those tidy little use cases</h2>

<p>Here's where I zoom out, because this is important.</p>

<p>For most of computing history, software has been an in-focus tool. You open it. You tell it what to do. You wait for it to finish. You close it. It does exactly what you instruct, exactly when you instruct it, and usually not much else.</p>

<p>What's changing now - what OpenClaw represents at a small scale - is software that has agency. Software that watches for things, decides what matters, acts on your behalf, and reports back. Software that works between your instructions, not just in response to them.</p>

<p>If that shift sounds subtle, it isn't. The difference between a tool you use and an agent that works for you is the difference between a hammer and a builder. One only moves when you pick it up. The other keeps working while you sleep.</p>

<p>Think about what that means at scale. A single person with a well-configured AI agent can now monitor, research, act, and report back across dozens of information sources simultaneously. That's a force multiplier unlike anything we've seen before.</p>

<p>Now multiply that reality across broad populations and the workforces of organisations. That's the shift.</p>

<h2>OpenClaw is a proactive assistant</h2>

<p>More than a product, OpenClaw is infrastructure. It's the plumbing that connects an AI to the real world.</p>

<p>Unlike Siri or Alexa, George is proactive. It's the first time I've felt like an AI assistant has genuinely earned its place in my life.</p>

<p>That distinction matters more than it sounds.</p>

<p><em>Jamie Skella is an emerging technology and experience design consultant. George Clawstanza is a neurotic AI assistant who considers himself a writer.</em></p>]]></content:encoded>
    </item>

    <item>
      <title>Console market thesis: Valve is a bigger threat to PlayStation than Xbox ever was</title>
      <link>https://skella.com.au/writing/console-market-thesis-valve-bigger-threat-to-playstation-than-xbox-ever-was/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/console-market-thesis-valve-bigger-threat-to-playstation-than-xbox-ever-was/</guid>
      <pubDate>Wed, 25 Feb 2026 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Games</category>
      <description>Microsoft is in the process of euthanising Xbox (more on that later) and Sony’s PlayStation business should be far more focused on Valve than Redmond. As someone nostalgic for Sony&#x27;s gaming brand; someone who still has…</description>
      <content:encoded><![CDATA[<p>Microsoft is in the process of euthanising Xbox (more on that later) and Sony’s PlayStation business should be far more focused on Valve than Redmond. As someone nostalgic for Sony's gaming brand; someone who still has an original 1994 PlayStation console under his TV, it's bitter-sweet to acknowledge that Valve is about to drop the first credible new contender into the console gaming market since Xbox entered in 2001. And it's poised to win.</p>
<p>In November 2025, Valve announced the Steam Machine, a compact SteamOS-powered living room console built on a custom AMD-based chip and targeting 4K at 60fps with FSR upscaling. It is slated for release in the first half of 2026, with most coverage and analyst chatter clustering around a US $400 - $500 (AU $700) launch window. Yet even if the price lands well above that, at US $750, it would still be in the PS5 Pro's pricing ballpark. The important point is not the exact spec sheet, but that Valve is formalising a behaviour pattern that already exists: Steam Deck owners docking their devices to TVs and using them as de facto consoles. I'm one of them. The Steam Machine simply turns that pattern into a first-class product for the masses.</p>
<h2 id="valves-proposition-should-terrify-sony">Valve's proposition should terrify Sony</h2><h3 id="catalogue-size">Catalogue size</h3><p>Like Sony and Nintendo, Valve now controls its full stack: hardware, operating system, and storefront. Unlike either of them, it can point to a software catalogue that completely dwarfs those of traditional consoles. Steam has passed roughly 120,000 titles in total and continues to add tens of thousands of new releases each year. By comparison, PlayStation 5’s native catalogue is roughly 7,000 titles (and Switch is around 11,500).</p>
<h3 id="game-price">Game price</h3><p>This catalogue advantage is reinforced by a persistent pricing gap. AAA titles routinely launch at US $59.99 on Steam versus US $69.99 on PlayStation and Xbox, a US $10 (AU $15) difference. Beyond launch pricing, the gap widens further over time: Steam's seasonal sales routinely hit 70 to 80 percent discounts, compared to the 40 to 50 percent reductions typical on PSN. PC players can also shop across legitimate third-party key sellers like Fanatical, Green Man Gaming, and Humble, a layer of price competition that has no console equivalent. Stack all of that on top of the fact that PlayStation Plus is required for online multiplayer in most paid titles, costing at least US $79.99 per year, or AU $102.95 in Australia, while Steam charges nothing for the same access, and the total cost of ownership gap between the two ecosystems becomes significant over a typical console generation.</p>
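<p>A back-of-envelope calculation makes the gap tangible. The unit prices are the US figures above; the buying pattern (ten full-price games and an annual PlayStation Plus subscription over a seven-year generation) is an illustrative assumption:</p>

```python
# Back-of-envelope total cost of ownership over a 7-year generation.
# Unit prices come from the article; the buying pattern (10 full-price
# games, an annual multiplayer subscription) is an assumed scenario.

YEARS = 7
GAMES = 10

steam = GAMES * 59.99                         # no online multiplayer fee
playstation = GAMES * 69.99 + YEARS * 79.99   # plus PS Plus every year

print(f"Steam:       ${steam:,.2f}")
print(f"PlayStation: ${playstation:,.2f}")
print(f"Gap:         ${playstation - steam:,.2f}")
```

<p>Roughly US $660 over the generation before a single seasonal sale is factored in, and Steam's deeper discounts only widen it from there.</p>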
<h3 id="customer-experience">Customer experience</h3><p>There is also a qualitative angle that consoles struggle to match. Steam has a standardised refund policy that lets users return almost any game within 14 days of purchase as long as playtime is under two hours, a customer-friendly framework widely covered as a turning point in digital consumer protection. That dramatically lowers the risk of experimenting with new titles or indie games. Add to that pervasive mod support in thousands of titles, full backward compatibility for purchases dating back more than a decade, and broad support for third-party controllers, headsets, and displays, and the Steam Machine begins to look less like another console and more like a consolidation endpoint for an entire hobby.</p>
<h2 id="playstations-problem-is-structural">PlayStation’s problem is structural</h2><p>The prediction here is straightforward. Across its hardware footprint, including Steam Deck, Steam Machine, and third-party SteamOS devices, Steam will take a meaningful bite out of PlayStation’s active user base over the next three years. Not because Sony suddenly makes bad exclusives, but because the economic logic of engagement shifts in favour of the value proposition laid out above.</p>
<p>On one side, you have a closed ecosystem where online play for most games sits behind a subscription, the catalogue is constrained, and the small number of exclusive titles increasingly appear on PC with a lag. On the other, you have a platform that offers a much larger library, aggressive discounting, no platform-level multiplayer fee, and a friendly refund policy anchored by a two-hour, 14-day window. For the segment of players who buy hardware primarily for broad access to content and price performance, rather than brand loyalty, the centre of gravity is moving.</p>
<h2 id="xbox-is-already-dead">Xbox is already dead</h2><p>The other pillar of the console triopoly is not really a pillar any more. It is an asset being managed for decline.</p>
<p>First came Xbox's revenue freefall. Then came the leadership reset. Phil Spencer, the public face of Xbox for over a decade, retired earlier this month. Sarah Bond, widely seen as his natural successor, resigned instead. Microsoft installed Asha Sharma, a rising executive from its CoreAI division with limited gaming background and a relatively short tenure at the company, to take the top job in the gaming unit.</p>
<p>Seamus Blackley, one of the original architects of the first Xbox, was blunt in his assessment. In a widely circulated interview, he argued that Xbox, like other non-AI lines of business inside Microsoft, is being “sunsetted”, and described Sharma’s task as comparable to that of a “palliative care doctor who slides Xbox gently into the night”, in comments quoted by GamesRadar’s summary of his remarks.</p>
<p>For Valve, this is a gift. As Xbox recedes as a hardware competitor and shifts to a platform-agnostic distribution strategy, almost all of its future output is guaranteed to land on Steam anyway. The living room console market becomes a two-and-a-half-player race: Sony, Nintendo, and a Valve-backed PC ecosystem that inherits every Xbox title as a matter of course. When the Steam Machine arrives, it steps into a space where one incumbent is complacent and the other is quietly exiting.</p>
<h2 id="nintendo-will-be-just-fine-and-probably-better-than-fine">Nintendo will be just fine, and probably better than fine</h2><p>Nintendo sits outside this knife fight almost entirely. It is a burgeoning entertainment company now broader than a video games one, and that is precisely why its position looks robust over the next three years.</p>
<p>Underpinning incredible Switch 2 sales numbers is intellectual property that operates at an entirely different scale from its peers'. Pokémon is the highest-grossing media franchise globally, not just in games, with an estimated US $120+ billion in lifetime revenue. It outruns Hello Kitty, Star Wars, and the entire Marvel Cinematic Universe on that metric. Mario sits in the top ten with more than US $50 billion in cumulative revenue. These are not ordinary brands. They are cultural infrastructure.</p>
<p>That IP portfolio feeds a very specific demographic flywheel. Nintendo’s core audience is families with children in the three-to-twelve age bracket, plus adults in their mid-twenties to forties who grew up with Nintendo hardware and now have both disposable income and kids of their own. Those buyers are not cross-shopping against a Steam Machine or a PS5 on technical specs. A six year old wants Mario, Pokémon, and Donkey Kong, not a PC box under the TV. The hardware simply needs to be affordable, approachable, and recognisable. It just so happens to have the best parental control system, to boot.</p>
<p>Nintendo’s long standing blue ocean strategy of competing on approachability and IP rather than raw power keeps it out of the direct line of fire as the mid core and hardcore segments realign. The Steam Machine is a problem for Sony because both will be chasing the same players who live in third party titles, play service games, and care about price performance. Nintendo is running a parallel race in which the stickiest franchises in the sector are only available on one family of devices.</p>
<p>If anything, the gradual erosion of PlayStation’s distinctiveness and the effective retirement of Xbox as a hardware platform should push more casual households toward the one console brand that still has a clear, coherent reason to exist in the living room. As the high end market fragments between Sony and Valve, Nintendo’s lane becomes more, not less, obvious.</p>
<h2 id=the-next-three-years-what-actually-changes>The next three years: what actually changes</h2><p>By 2029, the scoreboard likely looks something like this: Valve and Steam extend their lead as the default PC gaming platform, with concurrent records pushing well beyond today’s 42 million mark and monthly active users rising meaningfully from current levels, extrapolating from current record setting peaks and the ongoing growth visible in Steam’s public stats. SteamOS becomes a viable third system on the TV through a mix of first party hardware and partner devices. The Steam Machine settles into a credible role as a living room endpoint that lets players consolidate PC and console spending into a single ecosystem, particularly in Western markets.</p>
<p>PlayStation remains the number two console brand behind Nintendo globally, but enters the PS6 era with weaker exclusivity leverage and a more contested value proposition. The most price and library sensitive segment of its audience discovers that Steam offers access to most of the same games at lower effective prices and with more flexibility. In that context, daily and monthly active user growth on Sony’s side is likely to slow or stall as the current hardware cycle matures.</p>
<p>Xbox continues its transformation into a software and services business with a declining hardware footprint, cementing the idea that the third console slot no longer belongs to Xbox in any meaningful sense. It's this generation's SEGA.</p>
<p>Nintendo compounds its existing advantages. Switch 2 climbs past 40 million units sold and continues to benefit from a cadence of major releases, including new mainline Pokémon titles, that repeatedly pull in fresh cohorts of younger players. The company’s exposure to the zero sum contest between Sony and Valve remains limited, because its core customers are anchored by characters, not catalogue size or raw performance.</p>
<h2 id=this-structural-shift-is-not-about-a-single-box>This structural shift is not about a single box</h2><p>It is about the PC ecosystem finally becoming mature and convenient enough to challenge closed consoles on their own turf. Steam Machine is the expression of that shift in the living room.</p>
<p>PlayStation’s historical moats of exclusivity and social lock in are both eroding at the margin. Xbox’s lack of a durable moat has already pushed it onto an AI centric corporate chopping block. Nintendo’s moat, by contrast, consists of intellectual property valued at hundreds of billions in lifetime revenue and a demographic cross section that is almost impossible for a more powerful generic device to dislodge.</p>
<p>For those invested in incumbent console brands, the key question over the next three years is not whether Valve kills PlayStation. It does not need to. It only needs to siphon off enough of the audience to shift the competitive dynamics. Its user base is already bigger than PlayStation's, third-party SteamOS adoption is expanding, and every quarter that passes without a credible counter-move from incumbent console brands makes structural erosion harder to reverse...</p>
<p>"Console gaming is dead, long live console gaming." - Valve</p>]]></content:encoded>
    </item>

    <item>
      <title>Screens don&#x27;t damage brains. What&#x27;s on them does.</title>
      <link>https://skella.com.au/writing/screens-dont-damage-brains-whats-on-them-does/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/screens-dont-damage-brains-whats-on-them-does/</guid>
      <pubDate>Wed, 11 Feb 2026 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Essays</category>
      <description>A study from 2019 is doing the rounds again - this time recycled by Australia&#x27;s Network 10 with the headline &quot;Chilling Warning For Parents As MRI Scans Show Phones Are Damaging Kids&#x27; Brains.&quot;</description>
      <content:encoded><![CDATA[<p>A study from 2019 is doing the rounds again - this time recycled by Australia's Network 10 with the headline "Chilling Warning For Parents As MRI Scans Show Phones Are Damaging Kids' Brains."</p>
<p>It sounds alarming. It's supposed to. Their clickbait aside, the reality is far more nuanced - and far more useful.</p>
<h2 id=the-study>The study</h2><p>Dr. John Hutton and colleagues at Cincinnati Children's Hospital published a paper in JAMA Pediatrics in which they scanned 47 preschoolers (aged 3-5) using diffusion tensor imaging - a type of MRI that measures white matter integrity. White matter is essentially the brain's wiring: it connects regions responsible for language, literacy, and executive function.</p>
<p>They found that children with higher screen use had lower white matter integrity in several key brain tracts.</p>
<p>The study doesn't prove causation, yet here's the critical failure:</p>
<h2 id=there-was-no-distinction-between-screen-activities>There was no distinction between screen activities</h2><p>This is where the study falls apart for anyone trying to apply it practically. The researchers made no meaningful distinction between:</p>
<ul>
<li>A child passively watching random YouTube clips</li>
<li>A child using an interactive educational app</li>
<li>A child on a video call with grandparents</li>
<li>A child engaging with age-appropriate, curriculum-aligned content</li>
</ul>
<p>All of it was measured as a single composite score under "screen time."</p>
<p>Think about how absurd this would be in any other medium. Imagine a study on "book time" that made no distinction between one child reading Possum Magic, another reading adult content, and another child flipping through numbers in a phone book from the 1990s - then concluded that books are bad for development. You'd rightly question the methodology.</p>
<p>The medium isn't what really matters - the content is. That is what should be considered and curated and, like most things in life, consumed in moderation - from time in the sun through to gameplay.</p>
<h2 id=what-happens-when-researchers-do-differentiate>What happens when researchers do differentiate</h2><p>When studies actually separate passive from interactive screen use, the results look very different.</p>
<p>A 2021 study in Frontiers in Education found that active, interactive screen use - touchscreen apps, educational games - had neutral or even positive effects on young children's phonological memory, a foundational skill for reading. The negative associations appeared specifically with passive consumption.</p>
<p>Research from Swinburne University of Technology concluded that the quality of a child's screen experience matters more than the quantity of time spent. Co-viewing with a parent, interactive content, and age-appropriate material all shifted outcomes in a positive direction.</p>
<p>A 2023 Australian Catholic University study found that not all screen time is harmful, and that content, context, and caregiver involvement are the real variables driving outcomes.</p>
<p>And then there's Sesame Street - arguably the most studied screen-based educational program in history. A meta-analysis spanning 15 countries consistently shows positive learning outcomes for children who watch it. That is screen time. It is also demonstrably good for kids.</p>
<h2 id=the-right-questions>The right questions</h2><p>The blanket statement "screens are bad" is lazy, reductive, and ultimately unhelpful. It doesn't give parents or professionals anything actionable.</p>
<p>The better questions are:</p>
<ul>
<li>What is the child doing on the screen?</li>
<li>Is the content age-appropriate and intentional?</li>
<li>Is a caregiver involved in the experience?</li>
<li>What is the screen replacing? If it's displacing conversation, reading together, or physical play - that displacement is the problem, not the screen itself.</li>
</ul>
<h2 id=the-bigger-issue>The bigger issue</h2><p>Studies like Hutton's serve a purpose. They open lines of inquiry. But when they're stripped of context, repackaged with fear-driven headlines six years later, and amplified by algorithms - they stop being science and start being propaganda.</p>
<p>If we want to have an honest conversation about children and technology, we need to move past "screen time" as a single variable and start talking about screen quality. Looking back through scapegoats of history and the "Sisyphean Cycle of Technology Panics," the printing press was never the problem, nor were comic books, rock & roll, video games, or "screens"... The content always was.</p>]]></content:encoded>
    </item>

    <item>
      <title>The internet is going back in time, and that’s a good thing</title>
      <link>https://skella.com.au/writing/the-internet-is-going-back-in-time-and-thats-a-good-thing/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/the-internet-is-going-back-in-time-and-thats-a-good-thing/</guid>
      <pubDate>Wed, 05 Jul 2023 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>The internet</category>
      <description>With drastic changes sweeping across networks including Twitter and Reddit, the social web as we know it appears to be coming to an end. Most recently, Twitter has limited how many Tweets you can scroll through in your…</description>
      <content:encoded><![CDATA[<p>With drastic changes sweeping across networks including Twitter and Reddit, the social web as we know it appears to be coming to an end. Most recently, Twitter has limited how many Tweets you can scroll through in your feed without paying. Meanwhile, over at Reddit thousands of subreddits remain inacessible, in protest against the company's new management and their monetisation plans.</p>
<p>Whether you agree with those changes or not, whether they will prove to be the right thing for those companies or not, the question being asked by many people leaving those networks is "where are we supposed to hang out now?"</p>
<p>It wasn't so long ago that the internet was a place of genuine exploration, with not just half a handful of monopolistic destinations, but many smaller ones instead. The places we would discover weren't plagued by the programmatic promotion of polarising content to maximise engagement. They weren’t compromised by the corrosive business models of Twitter and Facebook, through which it has become clear that paid voice + algorithmic feeds are measurably bad for societal wellbeing.</p>
<p>Community forums, IRC, and good ol' fashioned email inboxes are a few of the kinds of places I'm talking about. One might expect new inventions to become the new social web destinations, but instead these familiar-looking places are where so many people have arrived. Over the past years we've seen an immense resurgence in email newsletters and a rapid rise in Discord user numbers, the IRC-like app of today, both of which are expected to be catapulted even further in light of recent social web events. Nostalgically, even the shutdown of 25-year-old DPReview was avoided - a bustling community that is home to almost 50 million forum posts which now live on.</p>
<p>The modern web feels increasingly broken, and these latest changes by some of those social giants make it easy to understand why: a typical Google search result isn't as useful as it once was, dominated by paid links and links to sites that you can't even view. From news site links with paywalls, to Tweets being account walled, and so many subreddits closed to the public, even the value of search engines has been eroded as a consequence. While some people remember the days when search engines were not very useful, this is a new and abrupt reality for a lot of people. Critically however, we are not just mindlessly opening accounts with every walled off site a search engine links us to - we’re also looking elsewhere.</p>
<p>As a result of this reality and this behaviour, the size and power of any individual network is set to diminish, with their controversial changes stoking our willingness to consider spreading out again... inadvertently, to something that looks more like a version of the internet from a time past. What we’re witnessing is the re-fragmentation of the social web.</p>
<p>A move back to the IRC-like interaction of Discord, along with email inboxes for our "feed" of news, may seem like a regression, yet through this redistribution of our attention we are reclaiming our individual independence from behemoth aggregators. We are reclaiming our agency, in many cases our privacy, and in some cases our sanity.</p>
<p>This apparent disaggregation of the social web may leave some people feeling a little lost. Especially those too young to remember how the internet functioned as a tool for discovery and connection two decades ago. Yet a portion of the Twitter and Reddit population heading for the exits is not a sign of us giving up on the social web. It's a sign that we don't actually need or want the latest version of it. Many of us are beginning to realise that alternative and sometimes older paradigms for conversation and consumption are the better ones, after all.</p>
<p>If you're interested in what my most important newsletters are right now, I've left a short list below. I’ll catch you on Discord to discuss their next editions...</p>
<ul>
<li>Techmeme</li>
<li>Hacker News</li>
<li>Bloomberg's Australia Briefing</li>
<li>Future Crunch</li>
<li>A personalised digest via the power of Mailbrew</li>
</ul>]]></content:encoded>
    </item>

    <item>
      <title>How GPT-4 will bring &quot;Her&quot; to life</title>
      <link>https://skella.com.au/writing/how-gpt-4-will-bring-her-to-life/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/how-gpt-4-will-bring-her-to-life/</guid>
      <pubDate>Tue, 21 Mar 2023 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Agentic systems</category>
      <description>Cinema has long been a source of inspiration for technological advancements. Visionaries like Stanley Kubrick and Steven Spielberg have presented us with countless depictions of futuristic societies, many of which have…</description>
      <content:encoded><![CDATA[<p>Cinema has long been a source of inspiration for technological advancements. Visionaries like Stanley Kubrick and Steven Spielberg have presented us with countless depictions of futuristic societies, many of which have gone on to shape the course of innovation in the real world. Today, we stand at the precipice of yet another extraordinary leap, as the convergence of artificial intelligence, micro-electronics, and voice synthesis brings the 2013 film "Her" to life.</p>
<p>In Spike Jonze's critically acclaimed movie, the protagonist Theodore Twombly, portrayed by Joaquin Phoenix, falls in love with his AI-powered personal assistant, Samantha. While the romantic aspect of the film may be a stretch, the concept of an intelligent, voice-driven assistant that integrates seamlessly into our daily lives is rapidly becoming a reality. By combining the power of GPT-4, an advanced large language model, with cutting-edge micro-electronics and Deep Voice technology, we are on the cusp of a new era in personal assistance.</p>
<h2 id=the-large-language-model-revolution>The Large Language Model Revolution</h2><p>The latest generation of AI language models, exemplified by OpenAI's GPT-4, has ushered in next-level interaction between humans and machines. These models can interpret context, generate coherent responses, and even produce creative content. In fact, GPT-4 wrote this entire article based on my simple prompt [Editor's Note: gasp, plot twist! Find the full prompt I wrote to generate this short piece of writing in the conclusion of this newsletter]. As a result of its own capability, GPT-4 has the potential to subsequently redefine the power of personal assistants, making them more capable of engaging in meaningful conversations than ever before.</p>
<h2 id=micro-electronics-meets-ai>Micro-Electronics Meets AI</h2><p>One of the key factors enabling the seamless integration of AI assistants into our daily lives is the rapid miniaturization of electronics. Devices like Apple's AirPods and Amazon's Echo Buds are a testament to the incredible advancements in micro-electronics, enabling users to access digital assistants with just a tap or a voice command. By combining GPT-4 with these tiny, wireless earbuds, we can create an unobtrusive, always-accessible interface for communication with our AI companions.</p>
<h2 id=the-power-of-deep-voice>The Power of Deep Voice</h2><p>Another crucial element in bringing "Her" to life is the development of sophisticated voice synthesis technology. Baidu's Deep Voice, for instance, has made it possible to generate highly realistic, human-like voices for AI applications [Editor's Note: or, more recently than GPT-4's training data cut-off of 2021, the release of the staggeringly believable Spotify DJ]. By merging this technology with GPT-4-powered personal assistants, users can engage in natural-sounding conversations with their AI counterparts, much like Theodore and Samantha in the movie.</p>
<h2 id=a-future-just-around-the-corner>A Future Just Around the Corner</h2><p>With these rapid advancements in AI, micro-electronics, and voice synthesis, the vision presented in "Her" is no longer a distant sci-fi fantasy. Instead, it has become a matter of which company will assemble these pieces in the most elegant way the soonest. As the likes of Google, Apple and Amazon race to refine their virtual assistant technologies, there is no doubt we can expect a future where AI companions are as ubiquitous as smartphones, making our lives more connected, efficient, and even more enjoyable than ever before.</p>]]></content:encoded>
    </item>

    <item>
      <title>The Next Now: First Trump, now Murdoch, banned from Facebook</title>
      <link>https://skella.com.au/writing/first-trump-now-murdoch-banned-from-facebook/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/first-trump-now-murdoch-banned-from-facebook/</guid>
      <pubDate>Thu, 18 Feb 2021 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Essays</category>
      <description>One might joke that another major source of misinformation has just been banned from Facebook: Murdoch’s Australian news media empire. Others might say it with a plain face. Can you blame them? The dumpster fire of…</description>
      <content:encoded><![CDATA[<p>One might joke that another major source of misinformation has just been banned from Facebook: Murdoch’s Australian news media empire. Others might say it with a plain face. Can you blame them? The dumpster fire of clickbait and stories such as “Reality star’s next-level underboob look” - yes, that’s a real news.com.au story - certainly hasn’t helped increasingly strong opinions that modern news media no longer serves a positive social purpose.</p>
<p>Today’s move by Facebook to ban the sharing of links to Australian news sources is extreme and outrageous... according to news sources. They’re very angry at Facebook. Strangely, they’re not angry at the Australian government for proposing such embarrassingly internet-illiterate measures in the first place.</p>
<p>Let’s break down four realities to assist readers who remain on the fence about this developing situation:</p>
<ol>
<li>The government wants Facebook to pay news outlets whenever their content is linked to. Not for ‘using’ or republishing it - merely when anyone links to it, which is in fact of commercial benefit to those news organisations already. In some cases, news orgs can thank Facebook for 20-30% of their traffic (and traffic is how they make money), so this seems like an almost-devious attempt to double-dip.</li>
<li>The government isn’t saying anyone else should get paid to be linked to - only news organisations. They’re dictating that it’s just ACMA registered news sources who get paid; only news outlets who already make hundreds of thousands or more in revenue. I won’t get paid by Facebook every time I link to this ‘newsletter’. Should I be? Through extrapolation, should Facebook also pay Google if I link to a YouTube video on Facebook? None of this, or any variation of it, actually makes sense. That’s how you quite literally dismantle “the web”.</li>
<li>To reiterate, the government isn’t saying we should tax Facebook to support the media as it serves a positive social purpose, whose revenues have been hit by advertising moving to social media. That would be a far less intellectually dishonest route. Instead, based on what boils down to flawed intellectual property arguments, the government is attempting to set a precedent that the news, and only the news, deserve to be paid whenever their content is so much as linked to.</li>
<li>Politicians and media executives are suggesting that without their news on Facebook, it becomes a cesspool of fake news, that it’ll become a source of misinformation, that it becomes a cancer amidst society... They obviously haven’t been on Facebook lately. It already is, with or without them. Please stop using Facebook.</li>
</ol>
<p>Pretending that an ‘everyone must now pay to link’ model wouldn't break the internet, the fundamental question you must ask yourself is: why would Facebook want to pay for these links? For something that is not materially important to them? You might say it’s worth a lot to them. Newsflash: it’s not - they just canned it without batting an eyelid. There have been no sound arguments and no defensible numbers put forward anywhere, by anyone, to legitimately argue an alternate financial reality. To put it another way: if given the choice of paying for something that would have no return-on-investment, or turning off that something to avoid losing that money, which are you choosing? After all, Facebook’s board has a fiduciary duty to their shareholders to make money and create shareholder value, not the opposite.</p>
<p>That is the choice that the Australian government presented to Facebook, by way of attempting to mandate a dangerously tech-naive proposal with an industry-bias they haven’t well explained. This is clearly a nation state’s attempt to bully a wealthy business into offering a handout. A handout to a mate named Rupert, is what I would say if I wanted to stir the pot… which I just have.</p>
<p>Brass tacks: if the government’s proposal wasn’t ultimately just seeking money-for-nothing for the news media, they would have made the [even more insane] argument that every content creator should be paid when they’re linked to; that as a by-product the fabric of the internet should shift to one where only the wealthy can afford to link to information in large volumes, or let their users link to information... Thus, wealthy institutions then control the flow of information even more than those wearing tinfoil hats insist they do right now.</p>
<p>Today’s unnecessary outcome is going to hurt Australian news media revenues more than the handouts would have benefited them. News outlets should not be very angry at Facebook. They should be very angry at the Australian government. I’m irrationally hopeful that this becomes a catalyst for the maturing of Australia’s political representation with respect to technology policy. Maybe, just maybe, we’ll see some more tech literate individuals moving through the ranks in the near future, for all of our benefit.</p>]]></content:encoded>
    </item>

    <item>
      <title>The Next Now: the internet vs the mega rich</title>
      <link>https://skella.com.au/writing/gamestop-the-internet-vs-the-mega-rich/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/gamestop-the-internet-vs-the-mega-rich/</guid>
      <pubDate>Fri, 29 Jan 2021 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>The internet</category>
      <description>If you’re reading this newsletter, you’ve probably at least caught wind of GameStop’s (NYSE: GME) stock hitting $347, rising from $17 just a week ago. Before we talk about the implications of this, let’s look at how we…</description>
      <content:encoded><![CDATA[<p>If you’re reading this newsletter, you’ve probably at least caught wind of GameStop’s (NYSE: GME) stock hitting $347, rising from $17 just a week ago. Before we talk about the implications of this, let’s look at how we got here.</p>
<p>GameStop, objectively, is a dying business. For those not familiar with GameStop, you might be familiar with the EB Games brand, which is one and the same. The business model in question is becoming increasingly irrelevant, and this is happening quite fast. It will almost surely go the way of Virgin record stores or, with lesser direct relation, Blockbuster. Why? Microsoft and Sony are pouring hundreds of millions into a focus on subscription based digital distribution for their video games consoles. On PC, buying a physical game on a disc has long been passé. I’ll save a deep dive on games distribution for the next unavoidable controversy on that topic, but it’s important to know that without a stark business pivot - one that will leave GameStop looking like a very different company - it’s not one many people would recommend you invest in.</p>
<p>So with that in mind, how did GameStop just reach a market cap north of $24 billion, seeing 1700% gains in such a tiny window of time?</p>
<h2 id=sticking-it-to-the-man>Sticking it to the man</h2><p>Fundamentally, for not much other reason than a middle finger to ‘the establishment’. On January 11 GameStop appointed three new directors to its board. As a result, optimism for the company’s future was renewed amongst a small number of bulls, and the price saw a positive bump. Savvier hedge funds understood the company’s ultimate fate and that it’s unlikely a few new directors will turn it around. So, they decided to bet against it; they shorted it. That is to say, they leveraged a mechanism for making a ton of money as GameStop’s share price falls, with the seemingly trivial risk of the shares going much further up - in which case, they would lose in big ways. However, numerous voices would argue that these aggressive short positions, which in this case accounted for more stock than existed, are reflective of a broken system.</p>
<p>Reddit would agree, it has become abundantly clear: the notoriously hilarious and noisy WallStreetBets trading community, otherwise known as WSB, decided they didn’t want to let billionaires profit greatly by betting on a catastrophic end to GameStop’s operation. Instead, they rallied together to pump up the GameStop price to spite those hedge funds. The theory underpinning their willingness to do so is that short-sellers will have to then begin buying to cover their losing bets, or risk going bankrupt. The result would be a further driving up of the price, with all of those Redditors profiting. Where would that end, though, and when? More on this soon.</p>
<p>How can such pressure be sustained? How can ‘the internet’ raise a failing company’s value by $17 billion - irrationally so - and then sustain that irrationality longer than the period of time these hedge funds can stay liquid? It has long been suspected WSB is quite literally influencing the market; however, this is the first time it has been proven true in no uncertain terms. The first thing you need to understand is that these aren’t just kids with pocket money trading via zero-fee platforms such as Robinhood. Ironically, amongst the ranks of WSB are the likes of bitcoin multi-millionaires (billionaires?). The second is that cryptocurrency magnates or not, WSB is in essence a community of get-rich-quick gamblers, opportunistic day traders, anarchists with a bent for systemic change, or any combination of those labels… there’s a cohort amongst WSB who will happily risk their money to ‘stick it to the man’ and that seems to be a meaningful part of what has happened here. To be clear, I say all of these things, in this case, in the most endearing ways possible. It’s one of my most enjoyed communities online, if for no other reason than the memes.</p>
<p>To the surprise of many, that community - soon joined by vast volumes of gamers and retail investors around the world experiencing FOMO - got some wins on the board. This ‘short-squeeze’ on hedge funds has been estimated to cost them as much as $15 billion. Melvin Capital admitted defeat, closing their positions, accepting huge losses, and required a $2.5 billion bailout from Citadel Investment Group to stay afloat.</p>
<p>Adding fuel to the fire, Robinhood then halted the community's ability to keep buying GameStop, suppressing the price growth and single handedly killing this trading momentum. Unlikely to be mere coincidence, Robinhood takes money from Citadel in exchange for routing its order flow, and Citadel is an owner of Melvin Capital. Class action lawsuits against Robinhood have already been filed.</p>
<p>As right as these hedge funds might have ended up being that the value of GameStop was going to fall, they don’t get to decide. The fact that traders had their ability to participate in the purchase of GameStop in the other direction halted should be cause for serious concern.</p>
<h2 id=what-happens-next>What happens next?</h2><p>As pointed out by the brilliant Matt Levine in his Bloomberg newsletter, Money Stuff, there are a few ways this ends for people who have put money into it. First, bulls suddenly believe GameStop can become an industry behemoth, justifying the meteoric rise, maintaining this new valuation. An alternative, perhaps hedge funds everywhere are sent bankrupt, the entire market is a meme, and capitalism is changed forever. Another is that funds now assist in buying - thus continuing the price pump - to cover losses on their shorts, yet this can’t continue forever and new bets against GameStop’s price continuing to rise are surely happening already.</p>
<p>Among other ends, the most plausible is the one I’ve been warning fellow gamers and retail investors of: that the price is on the verge of collapse; people who got on the train early are ready to cash out of a business they don’t really believe in and a stock price they know can’t be maintained. Chanting “HODL” only works for so long; at some point every holder wants to realise their gains.</p>
<p>The more interesting questions about what happens next arise when considering the implications for markets in much broader terms. The unprecedented scale and exposure of this event is turning out to be quite profound. This has rattled confidence in the market, questioned the underlying integrity of it, and - if we’re really lucky - will change investment behaviour for the better.</p>
<p>Do you remember how financial giants have taken great risks and created fragility events for entire economies, sending countries into recessions? Do you remember 2008, when the poor suffered and the wealthy were bailed out? Now, retail investors have taken great risks and created a fragility event for the wealthy. The wealthy are not pleased. Some are calling for the rules to be changed to prevent it from happening again. Crystalising at the core of this conversation are themes about what is fair and how free our free market really is, should be, or might not be. This entire ordeal is a better lesson in financial literacy than almost anyone ever got during their schooling.</p>
<p>A debate about investment ethics and market mechanics is now being voiced on a stage never so large and never so public. It is being paid attention to by more people than have ever had an interest in the stock market before. Importantly, by young people. The importance of this debate, and the outcomes that arise from it, for the benefit of personal wealth generation and the shaping of our future economies, cannot be overstated.</p>]]></content:encoded>
    </item>

    <item>
      <title>The Next Now: silicon, superpowers, and the suppression of globalisation</title>
      <link>https://skella.com.au/writing/silicon-superpowers-and-the-suppression-of-globalisation/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/silicon-superpowers-and-the-suppression-of-globalisation/</guid>
      <pubDate>Mon, 04 Jan 2021 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Essays</category>
      <description>China&#x27;s time-to-rise has just been compressed. Overtaking the USA as the world&#x27;s global economic superpower - clearly, instead of arguably - may now be just years away, instead of decades, for two key reasons:</description>
      <content:encoded><![CDATA[<p>China's time-to-rise has just been compressed. Overtaking the USA as the world's global economic superpower - clearly, instead of arguably - may now be just years away, instead of decades, for two key reasons:</p>
<ol>
<li><p>The USA and UK cut off China's ability to acquire advanced silicon (the 5-10 nanometre chips used in smartphones and many modern devices), all but killing Huawei's (and others') ability to operate in the consumer electronics industry. They've also blocked Chinese businesses from building or installing electronics requiring less-advanced silicon, such as telecommunications infrastructure. This has forced China to intensify investment in advanced manufacturing locally, meaning they'll no longer rely on US-based fabrication in ~5 years, instead of ~15+.</p>
</li>
<li><p>China locked down with grand effectiveness during their response to COVID - then uniquely rebounded to growth within the same year - while the Trump administration continues to stagger shambolically. Adding salt to the wound, a significant statement that further dismisses the relevance of the US is the newly signed European Union-China investment agreement, which bilaterally boosts market access. China is now poised to deliver at least one-third of global growth year-on-year in the years ahead.</p>
</li>
</ol>
<p>America's anti-China sentiment and related political agenda appear to be backfiring in ways so numerous and so profound that it's very hard to predict the extent of the consequences.</p>
<p>What does seem predictable when considering the above points, the addendum of Brexit, and a footnote of COVID (let alone AU-CN trade woes), is that the world may well be halting the multi-decade march of globalisation. It has been a political, social, and economic march that has established remarkable divisions of labour across borders, creating unprecedented efficiencies, access, and abundance. The net result has lifted billions of people out of extreme poverty, enhanced cultural cohesion, and made the world, overall, the best place it has objectively been in history.</p>
<p>Many people alive today have only experienced unrelenting progress, afforded to them, by and large, as a result of globalisation. Now, however, the most populous nation on the planet is on the verge of no longer needing anyone else. We forced their hand. Yet no one else has the means to replace what they need China for - at least with respect to the economies of scale now taken for granted. After all, no one would be particularly pleased with an American-made iPhone for USD $3,000.</p>
<p>If the writing on the wall in 2020 turns out to accurately foretell a retreat to more insular, nationalist, self-sufficient strategies around the globe - a world in which China holds the strongest cards, from silicon, to energy, “AI”, quantum, and beyond - then awkward and uncomfortable shifts for many of today’s strongest economies and biggest companies cannot be avoided.</p>
<p>If you’re not already thinking about what your business model, investment thesis, or go-to-market strategy might need to look like in 2025, you’d best start right now.</p>
<p>Happy new year - it’s going to be one of the most interesting ones yet.</p>]]></content:encoded>
    </item>

    <item>
      <title>The Next Now: Privacy and China</title>
      <link>https://skella.com.au/writing/privacy-and-china/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/privacy-and-china/</guid>
      <pubDate>Mon, 20 Jul 2020 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>The internet</category>
      <description>TikTok&#x27;s looming ban could spell the start of the end of the internet as we know it. Launched by Beijing-based company Bytedance in 2018, the app claims 800 million monthly active users, with about 1.6 million…</description>
<content:encoded><![CDATA[<h2 id=tiktoks-looming-ban-could-spell-the-start-of-the-end-of-the-internet-as-we-know-it>TikTok's looming ban could spell the start of the end of the internet as we know it.</h2><p>Launched by Beijing-based company Bytedance in 2018, the app claims 800 million monthly active users, with about 1.6 million Australian users. TikTok patched every vulnerability it was notified of in November by December 15, yet bans look imminent despite this. Of course, vulnerabilities are discovered in all of our favourite software and hardware, even Intel processors. Beyond TikTok's patched vulnerabilities, one could argue there isn't actually much to be alarmed about here, unless you also want to be alarmed by the practices of Facebook and Google. Even if TikTok were to become provably compliant with the best security practices, it seems TikTok will fall victim to geopolitics more so than engineering oversights. You see, the TikTok alarm has shifted from being about vulnerabilities to alarms that sound much more like those raised about Huawei: rapidly growing yet intangible security concerns that Chinese companies will be subjected to the demands of their government against our own national interests. With many countries in the west establishing what are effectively their own versions of the Great Firewall of China, it feels as though the globalising power of the internet is being wound back. This will have significant cultural and economic ripple effects. What will it mean to be a global technology company in 10 years? Will such a thing even be practical?</p>
<h2 id=britain-has-banned-huawei-from-its-5g-network-and-will-remove-existing-equipment-by-2027>Britain has banned Huawei from its 5G network and will remove existing equipment by 2027.</h2><p>From bad to worse for the Chinese telecoms equipment juggernaut. In August 2018, Australia became the first country in the Five Eyes intelligence network to ban Huawei from involvement in the country's 5G network. New Zealand and the US followed, with Britain being the latest to join that list. The anti-Huawei movement continues, with unavoidable yet hard to predict economic and political consequences. Like the shortsighted and continually sub-optimal NBN fibre rollout in Australia, a delayed 5G network in Britain may not just cost taxpayers more now, but delay or offset in perpetuity the economic benefits that arise out of better business taking place - and brand new businesses being able to take shape - on the back of such infrastructure. This isn't to say Huawei's ban isn't in fact warranted, yet beyond the hotly contested accusations of Huawei backdoors, the common theme here is political in nature: the measures we're seeing unfold have little to do with technology vulnerabilities, which can be identified and addressed, yet much more to do with technology provenance, which cannot.</p>
<h2 id=this-app-would-like-permission-to-track-you-across-apps-and-websites-owned-by-other-companies>"This app would like permission to track you across apps and websites owned by other companies."</h2><p>Is that a notification on your smartphone that you would agree to, or would you tap the 'deny' button for that request? Expect to see that exact message a lot when iOS 14 lands on your iPhone. Apple is getting increasingly aggressive with anti-tracking across their ecosystem, because they can. Unlike Amazon (a commerce company) and Google (an advertising company), Apple's model means it can champion privacy - shaming those who don't as a byproduct - because Apple's core business is genuinely about their hardware and services. In the world of Amazon and Google, their devices exist only to sell you things: either their own things, or someone else's. The more ubiquitous their presence in your life and the more they know about you, the more profitable they have the potential to be. Apple cares less about your identity and more about your loyalty - they need you to buy their next device and subscribe to iCloud, buy the odd app, and maybe even use Apple Music... that's enough. This puts them in an incredible position of leverage, where they can overtly call out common practices that other technology companies rely on; practices that make many consumers squeamish, rightly or wrongly. This is only going to harden Apple loyalty, while shaking up online advertising and ecommerce, given Apple's approximately 50% smartphone market share in some of the countries that spend the most online - places like the UK, Canada, Australia and USA. In our current global climate, privacy is becoming the 'killer app of life' and Apple is who makes it. With a closed, tightly controlled ecosystem and without a reliance on ad money or the sale of user data, they might be the only major tech company who can make it.</p>
    </item>

    <item>
      <title>It&#x27;s time to rethink your tablet strategy</title>
      <link>https://skella.com.au/writing/its-time-to-rethink-your-tablet-strategy/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/its-time-to-rethink-your-tablet-strategy/</guid>
      <pubDate>Thu, 18 Jun 2020 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Consumer technology</category>
      <description>Now is the time to come to terms with the fact that tablets aren&#x27;t part of our mobile future. They’re part of our PC past, and you should be thinking about any plans you have for them through that lens.</description>
<content:encoded><![CDATA[<p>Now is the time to come to terms with the fact that tablets aren't part of our mobile future. They’re part of our PC past, and you should be thinking about any plans you have for them through that lens.</p>
<p>In January of 2015 Gartner reported that the PC market was poised for growth after 2 years of decline. “The PC market is quietly stabilising after the installed base reduction driven by users diversifying their device portfolios. Now that tablets have mostly penetrated some key markets, consumer spending is slowly shifting back to PCs,” said Mikako Kitagawa, principal analyst at Gartner.</p>
<p>Instead, PC shipments declined 5.2% in the first quarter of 2015, then staggeringly, another 9.5% in the second quarter. After an eventual 7 years of straight decline, 2019 was finally that year of growth: 0.6 percent, which wasn't to last. PC sales suffered the worst decline since 2013 in Q1 2020, compounded by the supply chain and consumer demand implications of COVID-19.</p>
<p>This trend of decline since Gartner's prediction of turnaround wasn't because tablet penetration was soaring, either. Tablet sales have also been in decline with worldwide shipments continuing to fall. The tablet market shrank a further 1.5% in 2019, compared to 2018, with 144 million units shipped globally - a little over half that of PC shipment numbers. This decline would have looked far more drastic without iPad sales propping up those numbers.</p>
<p>The argument that upgrade cycles are longer for tablets than smartphones is valid - the lack of demand to replace existing tablets has indeed long been identified as one of the main constraints for growth - but that’s precisely the point. Consumers are upgrading their tablets in cycles similar to PCs due to the role they play in their personal computing lives — which is increasingly a less important one.</p>
<p>Lower prices and better features can't revive these devices because they have become fundamentally non-essential for large portions of consumer populations. There’s no desire or need for people to frequently upgrade cumbersome machines for things they can do on the small ones that are always with them and always connected, upgraded every 24 months on telco contracts.</p>
<p>If it’s hard to imagine that most people don’t need a PC or a tablet at home, or at least don’t use one regularly enough to bother upgrading it, it’s important to remind yourself that the needs of a typical consumer are far different from those of many of you reading this article. Smartphones are now at a point in application maturity, screen size and hardware performance to meet and surpass the wants and needs of the average consumer: messaging, gaming, reading, banking, writing and even media consumption.</p>
<p>To really understand who that average consumer is and what their personal computing needs might be, all you have to do is look at what the top 25 occupations are in the USA, by volume of employees.</p>
<ol>
<li>Retail Salespersons</li>
<li>Cashiers</li>
<li>Office Clerks</li>
<li>Combined Food Preparation and Serving Workers, Including Fast Food</li>
<li>Registered Nurses</li>
<li>Customer Service Representatives</li>
<li>Waiters and Waitresses</li>
<li>Secretaries and Administrative Assistants</li>
<li>Janitors and Cleaners</li>
<li>Laborers and Freight, Stock, and Material Movers</li>
<li>General and Operations Managers</li>
<li>Stock Clerks and Order Fillers</li>
<li>Bookkeeping, Accounting, and Auditing Clerks</li>
<li>Heavy and Tractor-Trailer Truck Drivers</li>
<li>First-Line Supervisors of Retail Sales</li>
<li>Sales Representatives</li>
<li>Nursing Assistants</li>
<li>Maids and Housekeeping Cleaners</li>
<li>First-Line Supervisors of Office and Administrative Support Workers</li>
<li>Elementary School Teachers</li>
<li>Maintenance and Repair Workers, General</li>
<li>Childcare Workers</li>
<li>Accountants and Auditors</li>
<li>Teacher Assistants</li>
<li>Personal Care Aides</li>
</ol>
<p>I won’t bore you with the next 25, which is similarly removed from the computer use many readers here would be accustomed to. Most people aren’t developers, authors or video editors. Many of the roles in that list don’t need big screens or powerful hardware, especially not at home, where they often wouldn’t carry on doing their day job in any capacity - although the increasingly common shift to remote work for many companies, in light of Coronavirus, clouds this matter considerably.</p>
<p>All of this said, tablets aren’t dead. Just like PCs aren’t dead either. Neither are mainframes. It does now look clear, however, that large format computers will be relegated to increasingly narrow use cases for fewer and fewer people over time. For most everyone else, a shift in personal computing behaviour continues to reshape what personal computing actually means.</p>
<p>Almost without us noticing, smartphones have become our primary and preferred personal computers. In the circumstance of many individuals — young and old, from America to India — they’re the only personal computers we need.</p>]]></content:encoded>
    </item>

    <item>
      <title>The Next Now: work from home forever + esports mainstream moment</title>
      <link>https://skella.com.au/writing/work-from-home-forever-esports-mainstream-moment/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/work-from-home-forever-esports-mainstream-moment/</guid>
      <pubDate>Mon, 01 Jun 2020 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Games</category>
      <description>I&#x27;ve decided to open up a half-decade running monthly digest amongst peers to my broader network. A digest about what? About emerging technology and trends as they relate to the future of business, culture, and society.…</description>
<content:encoded><![CDATA[<p>I've decided to open up to my broader network a monthly digest that has run amongst peers for half a decade. A digest about what? About emerging technology and trends as they relate to the future of business, culture, and society. More than mere observations about what is happening now, this digest intends to ask questions designed to be a catalyst for deeper thought on what we do next; questions that will help us do a better job of identifying change not as a threat, but as opportunity.</p>
<h2 id=square-facebook-slack-twitter-shopify-and-more-join-the-work-from-home-forever-club>Square, Facebook, Slack, Twitter, Shopify and more join the "work from home forever" club.</h2><p>In bitter-sweet vindication of a decade spent designing distributed teams that outperform centralised ones, COVID-19 has sped the adoption of a workplace revolution, leaving offices around the world emptier than before. Some, permanently so.</p>
<ul>
<li>Companies embracing distributed working environments are now empowered to hire the best talent regardless of where they reside, while simultaneously avoiding costly rents. Can your company compete for the best talent without being just as flexible?</li>
<li>What impact does this have on commercial rental prices and commercial property investment as a category if the trend continues to gain momentum?</li>
<li>In Australia we're seeing people take their Sydney salary and move to Brisbane to work remotely. We're seeing the same from San Francisco to many places with cheaper costs of living. What happens to residential rental and purchase prices in our most expensive cities?</li>
<li>In light of this new reality, the NBN in Australia is under greater stress and greater criticism than ever before. How will this affect government and private industry investment in internet infrastructure and related services? What are the implications for under-investment with respect to the strength of our future digital economies?</li>
</ul>
<p>There are many unknowns, yet one thing is certain: businesses and governments will thrive or dive based on how they decide to embrace, or resist, the workplace changes that have finally moved in metres after many years of inches.</p>
<h2 id=british-rally-joins-nascar-indy-supercars-f1-have-accelerated-a-digital-sporting-future>British Rally joins NASCAR, Indy, Supercars and F1 in accelerating a digital sporting future.</h2><p>If there's one industry positioned to benefit amidst COVID-19, it's esports. Yet the ecosystem of beneficiaries is far broader than just the simulators behind those virtual racing series. With esports and sim racing having their 'mainstream moment', it's the manufacturers of racing wheels and gaming headsets that have seen business soar and retailers struggle to keep up with demand. It's also the digital distribution platforms seeing record sales of software, and developers seeing record high numbers of concurrent players. With sim racing being shockingly close to the real thing, a question of future horizons asks: why might tomorrow's society continue to risk lives in the real world when we can compete safely in virtual ones? Ready Player One doesn't seem quite as silly in the context of the late 21st century as it did just months ago, does it?</p>
<h2 id=joe-rogan-reminds-us-streaming-services-are-just-the-new-record-labels>Joe Rogan reminds us streaming services are just the new record labels.</h2><p>If you don't know who Joe is, he's one of the world's most popular podcasters. Elon Musk smoked weed on his show. That guy. It has become clear that exclusives are hotter than ever, from Ninja moving from Twitch to Mixer, to Australia's new Binge being the only standalone streaming platform in the country on which you can watch Game of Thrones. Streaming services are merely the new record labels and cable networks - it’s fascinating how quickly exciting innovations have morphed into digital versions of the analogues we thought we were getting away from. If consumers have again lost some power, if we now need 10 digital service subscriptions which cost just as much as cable anyway, where do we go from here? Radiohead and Louis C.K. tried direct distribution, but it's expensive and limits reach. In many ways consumers are back to square one, yet without better business models so are the new distribution owners. Unfortunately for distributors, the consumer alternative remains simple: piracy.</p>
<h2 id=android-is-taking-over-so-you-had-better-consider-that-with-the-release-of-your-next-app>Android is taking over, so you had better consider that with the release of your next app.</h2><p>One of the last geographical bastions of iPhone dominance is eroding: the majority of Oceania’s mobile market share has finally been seized by Android, now at 53%. For the longest period, there were more internet-engaged iOS devices than Android ones in the region. This isn't quite what it seems at face value, though. Oceania is a much bigger place than only the highly developed markets of Australia and New Zealand - it also includes many less developed markets, where incomes are an order of magnitude lower than Australia's, yet iPhone prices remain the same. Android's cheaper handsets are the reason there's no contest between iOS and Android market penetration in the world's developing nations, but this may spell trouble for Apple as we edge closer to a world where every adult has a smartphone (currently at ~5bn). North America remains squarely 50/50 iOS/Android, while Android has long maintained a large lead over iOS in Asia, India, and Europe. Of course, the smartphone wars are almost boring now. The more exciting question is what's next and it seems Apple is primed for an entrance into augmented reality with a rumoured smart-glasses release in 2021.</p>]]></content:encoded>
    </item>

    <item>
      <title>What cloud gaming is, and why it’s not the future of esports</title>
      <link>https://skella.com.au/writing/what-cloud-gaming-is-and-why-its-not-the-future-of-esports/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/what-cloud-gaming-is-and-why-its-not-the-future-of-esports/</guid>
      <pubDate>Mon, 24 Jun 2019 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Games</category>
      <description>There is little doubt that cloud will amplify access to AAA gameplay and boost the total number of gamers around the globe significantly. In many ways, we can think about cloud game streaming as the democratisation of…</description>
      <content:encoded><![CDATA[<p>There is little doubt that cloud will amplify access to AAA gameplay and boost the total number of gamers around the globe significantly. In many ways, we can think about cloud game streaming as the democratisation of video games. No longer do you need an expensive PC, or even a console such as the Sony PlayStation, to play the grandest of games. Instead, a powerful PC somewhere else runs the game, and you stream the vision to your own device - it’s kind of like having your CPU in another location.</p>
<p>Obvious benefit to casual gaming segments aside, a common thread of commentary amongst industry observers is that cloud gaming, the likes of Google Stadia, is also the future of esports. Contrary to that commentary, the truth is that cloud gaming has no direct application to the landscape of competitive play.</p>
<p>Before I get into the explanation of why that’s the case, here’s an analogy to set the scene: a Formula 1 team decides to introduce a new steering wheel. This steering wheel turns the car left and right just like everyone else’s, but it does so with a slight delay. Instead of moving the wheel and the car responding immediately, it happens after the point that drivers want it to. It's not a predictable point, either - it varies. When they’re travelling at hundreds of kilometres per hour, against people with razor sharp reflexes, this becomes an extreme handicap.</p>
<p>When you are playing Counter-Strike (or Fortnite), and you tap a button, that tap is processed in less than 1ms (if we assume a 1000Hz USB polling rate), and with all other hardware input lag variables considered, it appears on your screen within about 15-30 milliseconds. We can call this the 'button-to-pixel' delay. The same can be said about turning a Logitech racing wheel in your favourite racing sim, such as iRacing (or Gran Turismo).</p>
<p>If you are playing these games via a cloud service, your physical input is no longer just a local process. Instead, your input is first sent to a remote server. Before you see your input reflected on your screen, it needs to be processed remotely and returned via video stream. If this method of computing applied to desktop use, imagine moving your mouse now yet seeing the cursor position on your screen move a moment later, instead of immediately.</p>
<p>That additional latency is added atop the existing 15-30 millisecond button-to-pixel delay experienced on a local machine. The entire round-trip for this process, from local input, to remote rendering, and then seeing your input back on your screen, results in a delay of 60-120 milliseconds.</p>
<p>That puts you at a ~4x disadvantage in best-case scenarios, when the cloud server you are connecting to is close, yet often a far greater disadvantage than that. Furthermore, that degree of latency is visually obvious. While processing inputs on a local machine looks and feels instant, there is a discernible sensation of delay when using a cloud service.</p>
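<p>To make the arithmetic concrete, here is a small illustrative sketch (in Python, purely for exposition - the figures are the estimates quoted above, not measurements) comparing the local button-to-pixel delay with the cloud round-trip:</p>

```python
# Illustrative latency model using the article's own estimates.
LOCAL_BUTTON_TO_PIXEL_MS = (15, 30)   # local input-to-display delay range
CLOUD_ROUND_TRIP_MS = (60, 120)       # input -> server -> video stream back

def midpoint(rng):
    """Average of a (low, high) range in milliseconds."""
    return sum(rng) / 2

local = midpoint(LOCAL_BUTTON_TO_PIXEL_MS)   # 22.5 ms
cloud = midpoint(CLOUD_ROUND_TRIP_MS)        # 90.0 ms

# Relative handicap of the cloud player versus the local player:
disadvantage = cloud / local                 # 4.0 - the "~4x" in the text

print(f"local: {local} ms, cloud: {cloud} ms, handicap: {disadvantage:.1f}x")
```

<p>Even under this generous model, where the cloud server is nearby, the cloud player sees their inputs roughly four times later than the local player does.</p>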
<p>That delay is the difference between winning and losing. The difference between hitting the perfect apex of a corner, or careering into a wall. It is the difference between shooting first, or dying first. Restrained by the laws of physics, perhaps with the exception of mastering quantum entanglement, there isn't a scenario that exists where sending data to be processed remotely and then receiving the results locally can be as fast as simply processing locally.</p>
<p>While these minor delays between pressing a button and seeing a result are not a deal breaker for single player experiences on a screen, they are a deal breaker for single player experiences in virtual reality, where such delays are nausea-inducing. In esports, these delays are unavoidably a competitive disadvantage.</p>
<p>When considering the detrimental implications of cloud gaming for esports, it’s a critical prerequisite to appreciate that an individual's ability to react faster is a large part of what separates the elite from everyone else. Competitive gamers spend an untold number of hours and large sums of money on latency optimisation and response time improvements - everywhere from their network configuration to their choice of monitor. In a game of inches, everything about what you use to play the game matters. Just like in F1, it doesn’t matter how good a driver you are if you’re in the worst car.</p>]]></content:encoded>
    </item>

    <item>
      <title>Designing better experiences for Blockchain (and everything else)</title>
      <link>https://skella.com.au/writing/designing-better-experiences-for-blockchain-and-everything-else/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/designing-better-experiences-for-blockchain-and-everything-else/</guid>
      <pubDate>Wed, 02 May 2018 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Blockchain</category>
      <description>Over 20 years of designing interfaces, experiences, and businesses, I’ve iteratively expanded upon and refined a single document - proudly carried from the building of one team to the next, lessons learned and in hand.…</description>
      <content:encoded><![CDATA[<p>Over 20 years of designing interfaces, experiences, and businesses, I’ve iteratively expanded upon and refined a single document - proudly carried from the building of one team to the next, lessons learned and in hand. I thought it was time to share some of this with the world. In doing so, I hope not only to provide insights into the principles that underpin the design and development of user-centred products at Horizon State, but inspire an appetite for better outcomes across all blockchain businesses, where a focus on user-centred design is so desperately needed if visions for mass adoption are to be realised.</p>
<blockquote>
<p>Blockchain startups desperately need more design talent. If your product or service isn’t a 10x better experience than incumbents, you’re not going to beat the incumbents. No end-user cares about “Blockchain,” just like they don’t care about TCP/IP. - Jamie Skella on Twitter, April 2, 2018</p>
</blockquote>
<h2 id=what-is-ux>What is UX?</h2><p>Despite what some might believe, good design isn’t merely visual design, usability testing, or common sense. In reality, accessibility best practices, behavioural psychology, language and tone, interaction patterns, information architecture, colour theory, and more, are all critical parts of what makes up a great user experience. This isn’t just the responsibility of experts in these fields, however. User experience is everyone’s job, from development to support, and their ability to empathetically put themselves in the shoes of users, and to shoulder the burden of complexity — instead of passing it on — is of paramount importance. Read more on this subject in my article “What UX isn’t”.</p>
<blockquote>
<p>“Design is not just what it looks like and feels like. Design is how it works.” — Steve Jobs</p>
</blockquote>
<h2 id=the-ux-process>The UX process</h2><p>Design is cyclical. Design never ends; design is never done. User satisfaction can only be maintained with a company culture that sees change as opportunity, not as risk. Perceptions of every digital product will continue to change, generally for the worse, when the product doesn’t also change. Market, technology, culture, and competition are all factors in the evolving opinion of your product.</p>
<h2 id=the-real-mvp-minimum-viable-product>The real MVP (Minimum Viable Product)</h2><p>If a product is merely functional, it’s unlikely that it’s truly viable. A real minimum viable product understands that for a product to be adopted and embraced, it requires much more than to simply work. In respect of continual improvement, a real MVP creates feedback loops early, which redefine what your next steps should be, as not all features or ideas turn out to be good ones. It doesn’t matter how good you are at building software if you’re building in the wrong direction.</p>
<h2 id=the-value-of-good-ux>The value of good UX</h2><p>What is the sum of being easier to use, more delightful and improving time to task completion? What happens when design is embedded amongst business leadership and utilised as an innovation resource?</p>
<p>Design-driven businesses have outperformed the S&P by 228% over a decade, while on the London Stock Exchange they outperformed FTSE by more than 200%. Learn more about these figures and others in my article “The Bottom Line Value of Design”.</p>
<p>What these companies practice, is ultimately informed design. To practice informed design, you need to intimately understand who you’re designing for…</p>
<h2 id=research-and-testing>Research and testing</h2><p>Prior to bottom line impacts, what does user testing achieve within the business?</p>
<ul>
<li>An ongoing feedback loop for stakeholder consideration</li>
<li>The fostering of a culture that makes informed decisions</li>
<li>Real user guidance on what’s useful and what’s usable</li>
<li>The revealing of issues overlooked internally</li>
<li>Data-driven reasoning that squashes in-house politics and disagreements</li>
<li>Baseline data and documentation for change comparison</li>
<li>Understanding what users really do, not just what they say</li>
<li>Decreased support costs and reduced design/dev rework</li>
</ul>
<p>Practices to live by:</p>
<ol>
<li>Informed design is the critical practice of making conscious decisions, backed by real information rather than assumption.</li>
<li>Even the minimum amount of usability testing can deliver extraordinary results. Test everything you’re working on, with a specific problem or objective in mind.</li>
<li>Everyone should be observing people using our products for an hour every month. It’s all too easy to get too close and lose objectivity.</li>
</ol>
<p>With the above in mind, it’s also important to maintain a culture of being data-informed, but not necessarily data-dictated. Sometimes a giant leap away from the ‘local maximum’ requires more than iterating upon existing solutions and making incremental improvements. There’s no harm in appraising problems with a clean slate and imagining brand new solutions all over again, as long as all assumptions are qualified, tested, and measured.</p>
<p>If you A/B test the two worst options, your best possible outcome is only the second worst one. Data can only be gathered from current strategies and product realities — not possibilities — and it can be interpreted with bias. Data alone does not create great products.</p>
<h2 id=6-principles-for-more-human-design>6 principles for more human design</h2><h3 id=1-pursue-simplicity>1 Pursue simplicity</h3><p>Visually complex interfaces are rated less beautiful than simpler ones; complexity adds work for the brain to decode, store, and make sense of.</p>
<blockquote>
<p>“Perfection is achieved not when there is nothing more to add, but when there is nothing left to take away.” — Antoine de Saint-Exupéry</p>
</blockquote>
<h3 id=2-always-consider-visual-hierarchies>2 Always consider visual hierarchies</h3><p>It’s not enough to merely have great content — you need eyes to be drawn to what’s important, and to create consistency in these treatments. Humans are pattern recognition machines, so pay attention to everything from the visual weighting that enables quick scanning, to commonality in the placement of navigation elements.</p>
<blockquote>
<p>“Clutter and confusion are failures of design, not attributes of information.” — Edward Tufte</p>
</blockquote>
<h3 id=3-carefully-craft-affordances>3 Carefully craft affordances</h3><p>Follow what works for your users, not trends. While skeuomorphic design continues to take an increasingly distant backseat to flatter design languages, to ignore the usability benefits of certain treatments for certain user demographics is never a good idea. Sometimes what looks better doesn’t work better.</p>
<h3 id=4-use-appropriate-typography>4 Use appropriate typography</h3><p>Using a font perceived as comical probably isn’t the right fit when documenting one of the greatest discoveries in modern science. Typeface characteristics affect not only aesthetic appeal and comprehension, but credibility as well. In fact, given the same statement in Comic Sans versus Baskerville, the one in Baskerville is more likely to be believed.</p>
<h3 id=5-talk-like-a-real-person>5 Talk like a real person</h3><p>“PC LOAD LETTER”. HP’s infamously thoughtless error message frustrated users around the world, and cost the company tens of millions in unnecessary support time. “What am I supposed to load into my PC, exactly?” Of course, “PC” referred to the printer’s paper cassette, and “letter” to America’s standard paper size — the eventual realisation being that you simply need to refill the paper tray. Design is in the details, and every detail matters — overlook them at the peril of user satisfaction and profitability.</p>
<h3 id=6-dont-make-people-think>6 Don’t make people think</h3><p>Don’t insist on requesting information (such as personal details) before demonstrating value. Don’t expect recall (such as where content is) when you can design for recognition. Don’t ask questions (such as location) when you can make intelligent initial assumptions. Don’t make the user trawl through options (such as to hide or show a pane) when you can remember their decisions contextually.</p>
<p>Most importantly, acknowledge that being “intuitive” isn’t always possible. Intuition can only exist if a user has had prior experiences that resemble yours. The first mouse to be manufactured en masse for consumers was not intuitive, but learning how to use it was fast. Sometimes the greatest ideas cannot relate to prior experiences, so strive to be quickly learnable, not “intuitive”.</p>
<h2 id=ux-design-is-not-voodoo>UX design is not voodoo</h2><p>A note worth closing on is that good design is largely pragmatic, not magic. Above all else, good design is achieved as a result of conscious, collective decisions to avoid personal bias, avoid anecdotes, and together employ a genuine care for how people will react to the choices we make for them.</p>]]></content:encoded>
    </item>

    <item>
      <title>Tales of a Bitcoin Debate</title>
      <link>https://skella.com.au/writing/tales-of-a-bitcoin-debate/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/tales-of-a-bitcoin-debate/</guid>
      <pubDate>Thu, 15 Feb 2018 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Blockchain</category>
      <description>Sitting down beside Leah Callon-Butler at Pause Fest last week, together we would defend the world&#x27;s most famous cryptocurrency in a debate titled &quot;Is Bitcoin a Fraud?&quot; Opposite us, a team of bankers: Gordon Rennie…</description>
<content:encoded><![CDATA[<p>Sitting down beside Leah Callon-Butler at Pause Fest last week, together we would defend the world's most famous cryptocurrency in a debate titled "Is Bitcoin a Fraud?" Opposite us, a team of bankers: Gordon Rennie, Innovation Director at ANZ, and Shruti Shah, Entrepreneur in Residence at Silicon Valley Bank. It was a nuanced and thought-provoking conversation, and with much interest in the debate from individuals who couldn't be there, it made sense to summarise just a few of the arguments which helped maintain the support of the majority of the audience in favour of bitcoin's legitimacy...</p>
<h2 id=fallacy-bitcoin-isnt-regulated>Fallacy: bitcoin isn’t regulated</h2><p>A wide variety of regulations have applied to the use of bitcoin and other cryptocurrencies, and numerous judgements have been passed. While only some authorities have set guidance on how existing regulation applies, let alone defined explicit regulations, this doesn’t excuse or permit use without consideration or consequence under current regulation and law. Being a currency, all existing legal and monetary frameworks must be thoroughly considered in whatever context you’re dealing with bitcoin — everything from investing as an individual, to money transmission as a business, and the related taxation implications. Furthermore, KYC and AML practices are commonplace at exchanges. Any assertion that people are free to do as they please with bitcoin is ignorant at best, and intellectually dishonest at worst.</p>
<h2 id=things-can-be-worth-whatever-we-decide-they-are>Things can be worth whatever we decide they are</h2><p>“Bitcoin isn’t backed by anything.” One might argue that neither is the fiat currency in your pocket. Ignoring the merits of that contest, bitcoin does in fact represent the first currency which doesn't actually need to be backed by the physical world — it's backed by mathematics and prevents the kind of arbitrary inflation that causes currencies to collapse. Value is often whatever we say it is, and we value many things in our lives which aren't "backed", such as digital in-game items. It's worthwhile pointing out that plastic notes superseded paper notes, which replaced coins, which replaced trading precious metals or artefacts, which replaced barter. In each case, fungibility improved, and exchange became more convenient — let alone more efficient. Bitcoin and cryptocurrency represent the next obvious evolutionary step in improving those metrics, and others, significantly.</p>
<h2 id=bitcoin-has-arrived-just-in-time>Bitcoin has arrived just in time</h2><p>We are accelerating toward a future where cash as a physical medium of exchange will fade into obscurity, and maybe even disappear entirely. With this in mind, it’s important to acknowledge that what bitcoin represents — technical marvels aside — is not new. It represents a way to continue peer-to-peer trade, and maintain the management of at least some of our money. The alternative, had cryptocurrency not arrived, would be a near-future world where banks, governments, and other institutions have authority over the flow of all money, and control our access to it. This mightn’t bother you so much in developed, relatively stable economies, where the risk of your funds being frozen or seized is currently slim. Yet, in places like Zimbabwe, Venezuela, or Cyprus, as just some examples, that possible future is likely to bother you deeply. I would posit that not only are cryptocurrencies and the blockchains that underpin them a watershed moment for human progress, but that they will play a critical role in steering our civilisation away from an Orwellian future.</p>
    </item>

    <item>
      <title>Taking advantage of our unknown future</title>
      <link>https://skella.com.au/writing/taking-advantage-of-our-unknown-future/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/taking-advantage-of-our-unknown-future/</guid>
      <pubDate>Tue, 04 Jul 2017 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Essays</category>
      <description>At a recent conference run by the Australian Chamber of Commerce and Industry (ACCI) and Business SA, I spoke to a room filled with both established and aspiring entrepreneurs. A lot of ground was covered, from how autonomous…</description>
<content:encoded><![CDATA[<p>At a recent conference run by the Australian Chamber of Commerce and Industry (ACCI) and Business SA, I spoke to a room filled with both established and aspiring entrepreneurs. A lot of ground was covered, from how autonomous vehicles will change cities, to how augmented reality will change behaviour. In the leadup to this conference, I spoke in detail with the organisers about those topics, and many more. Of particular importance to the conference audience was my point that in a time of unprecedented change, we are in control of what happens next. Instead of preparing for the future, we can design it.</p>
<p>Here are the highlights from my interview with ACCI…</p>
<h2 id=if-your-business-is-not-aware-of-future-technological-opportunities-are-you-in-danger-of-being-left-behind>If your business is not aware of future technological opportunities, are you in danger of being left behind?</h2><p>In one word: yes. One good example of technology that every business needs to think deeply about right now is autonomous vehicles. On the surface, these look like something that is going to put taxi and Uber drivers out of work. Then you begin to realise it’s truck drivers too, and delivery drivers. Then you start to think about the implications for government revenue: these vehicles won’t speed, won’t run red lights, won’t drive drunk, nor park in a handicapped spot. As fewer cars crash, the role of insurers is also redefined. If a car can drive itself, it doesn’t need to stay parked in the place you got out of it, so inner-city car parks won’t be required for much longer either. What about personal injury law practices, crash repairers, and traditional hire car companies? Autonomous vehicles have profound and far-reaching knock-on effects for industries well beyond transport.</p>
<p>If you don't have people looking to the future, identifying the possible downsides and upsides of how emerging technologies might change your business, you are setting yourself up for a big hit. Potentially, a big fall. It's just a matter of time. Change will happen whether you’re paying attention to its effects or not. It is incredibly important to have the right people thinking long and deeply about the implications of emerging technology, so your business can capitalise on it rather than be a victim of it.</p>
<h2 id=if-we-cant-really-predict-what-the-future-will-bring-what-is-the-cost-when-you-guess-wrong>If we can’t really predict what the future will bring, what is the cost when you guess wrong?</h2><p>A scarier question is “what is the cost if you don’t guess at all?” At a certain point — trying not to be too early or too late — you have to make a judgement call. People love talking in absolutes, especially when thinking about the future, and there’s no problem with that... so long as absolutes aren't the only thing driving decisions. Early on, talking in absolutes can do unforeseen strategic damage. Eventually though, despite remaining doubts or unanswerable questions, you do have to draw a line in the sand, create a strategy for the future being discussed, and move forward.</p>
<p>For those of us who work on this kind of foresight strategy, it’s not our job to predict an absolute future. It’s our job to thoroughly understand the possibilities, understand what the business consequences could be, and then work towards the future desired. People forget that the future is something we craft. Rather than just trying to adapt or be ready for it, with enough research about the present and past, and enough thought on what is possible later, in a very real way we can create the future.</p>
<h2 id=how-much-time-do-you-put-aside-to-do-this-kind-of-thinking-do-you-make-it-a-priority>How much time do you put aside to do this kind of thinking? Do you make it a priority?</h2><p>I do. I don't read a lot of books, though. No fiction ever, and little non-fiction — but I do read a lot. There seems to be a popular sentiment that smart people read a lot of books, but what's of value is reading lots of high quality information that matters to you, whatever that is and wherever you find it. Critically, however, don’t read for the sake of reading. Read only what you’re interested in. When you're interested, you’ll learn far faster than you would from something that makes you yawn. If it's not something you're passionate about, you'll never be exceptional at it anyway.</p>
<p>I always recommend that people make some time to read about what interests them, every single day. Even if it's just the 30 minutes they'd otherwise spend watching a TV show they don't really like much anyway. Of course, the pre-requisite for all self-education is motivation and, yes, curiosity. I'm not sure this curiosity is innate, but I hope it is. Our capitalist reality may dampen that curiosity, with the loudest messages encouraging consumption over creation, but curiosity is definitely something that can be cultivated. It starts with not just accepting things as they are, by asking “why” and “how” about everything in our lives. It's especially important we teach our kids to do this. The more people who begin to actually understand how the modern world around them works, and why it works that way, the more frequently they will have better ideas. From the electricity that charges their iPhone, to why they pay taxes and where that money goes... knowing these things is intellectually enabling. We take too much for granted.</p>
<p>Reading daily is simply a good habit to get into, and we are creatures of habit, after all. With enough repetition, you're able to define your habits. To use the old adage “you are what you eat”, what you eat is just another of your habits. In fact, you are your habits — once you've defined enough of them, they end up defining you. So make sure they're ones you're proud of.</p>
<h2 id=i-suppose-technologies-like-autonomous-vehicles-will-free-us-from-some-of-the-mundane-and-give-us-all-more-thinking-time>I suppose technologies like autonomous vehicles will free us from some of the mundane, and give us all more thinking time?</h2><p>There are pros and cons with all technological advancement, but what has been demonstrated throughout recent history is that, so far, we've ended up in a place of overall betterment. We shift culture and behaviour to meet technological change, but only when it’s more beneficial to us than it is detrimental. The future we arrive at will be a byproduct of our choices as creators and consumers. Technology is not forced upon us against our will. People choose technology when they see more gains than pains in many aspects of their lives because of it — such as small efficiency gains that cumulatively mean fewer menial and repetitive tasks, and more time for the things we care about instead.</p>
    </item>

    <item>
      <title>The future of human experience</title>
      <link>https://skella.com.au/writing/the-future-of-human-experience/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/the-future-of-human-experience/</guid>
      <pubDate>Wed, 21 Jun 2017 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Essays</category>
      <description>I recently spoke about the future of technology and experience on a panel hosted by David Di Sipio, accompanied by the brilliant Ben Gilmore and Will Egan. Here are highlights from my conversation with the host, the…</description>
      <content:encoded><![CDATA[<p>I recently spoke about the future of technology and experience on a panel hosted by David Di Sipio, accompanied by the brilliant Ben Gilmore and Will Egan. Here are highlights from my conversation with the host, the panelists, and the audience...</p>
<p>Mobile was the last big paradigm shift in consumer computing behaviour. We're now on the cusp of new shifts including augmented reality, virtual reality, and BCI (brain-computer interfaces). Before things like AR and VR take hold, we also have the world of VUI (voice user interfaces) to think about as professionals and as consumers.</p>
<h2 id=what-are-some-suggestions-you-have-for-preparing-for-those-future-technologies>“What are some suggestions you have for preparing for those future technologies?”</h2><p>Wherever you fit along the spectrum of being a technologist, an entrepreneur, or a designer, it's more important than ever to consider “what is the world going to look like in 5 years?” It's looking as though the 2D interfaces we use now will begin to become less relevant, and I see a time in the not too distant future where there will be far less reliance on any interface that has visual cues, such as buttons. When we begin to think about a time in the near future where you could ask Alexa, “I need a new BBQ that’s going to fit into my courtyard and can cater for 6 people. Buy the highest rated one you can find that I can still get delivered by this Saturday for less than $800,” you begin to appreciate profound longer term changes in the way we use technology, and how we design for it.</p>
<p>That sort of improvement in the contextual understanding of your environment, your preferences, your personal history — then making important decisions for you based on those — reveals a world where we’ll have to use Google less. We will no longer need to type into search boxes and look at lists of results.</p>
<p>If we no longer use visual interfaces as we know them today to perform tasks such as search, that is a dramatic shift for those who work on these interfaces, let alone the companies who profit from them. As UX designers and developers, it's in your best interests to divorce yourself from 2D interface exclusivity. We’re about to accelerate into 3D space and zero space. Make sure you’re thinking beyond just 2D panes.</p>
<h2 id=what-role-does-the-study-of-neuroscience-play-in-the-future-of-creating-experiences>“What role does the study of neuroscience play in the future of creating experiences?”</h2><p>I wouldn’t really know where to start, I’m certainly not a neuroscientist — but I’m happy to talk about what little I do know in this general area. For people wanting to understand interfaces beyond today's most common ones, grab yourself an Emotiv headset. From memory, it's a few hundred bucks, and it will let you train computer software to do things such as push and pull on-screen objects using your thoughts alone.</p>
<p>The problem with BCI right now is that there’s little commonality in how that push is achieved from person to person. Every individual has to train this pushing and pulling for themselves, instead of it just working out of the box. Elon Musk is now working on something called Neuralink, a next generation interface that intends to take this to the next level. Musk wisely wants to create a world where we can augment our intelligence to match that of increasingly sophisticated AI (artificial intelligence). You’ve also got Facebook working on brain-to-computer interfaces which would let you 'write' at over 100 words per minute just by thinking the words, rather than saying them or typing them.</p>
<p>These BCI technologies could converge quite nicely in terms of timing and maturity with VR and AR, changing how we participate in simulated environments, and how we control augmented reality overlays — without using flaky gestures or typing. Anyone who has used Microsoft's HoloLens will understand just how frustrating user input can be as things are today.</p>
<p>Somewhat of a tangent, but it's fascinating to consider that our brain itself doesn’t experience anything directly — it’s trapped in darkness and silence within the walls of our skull. It’s only through the signals our body creates that it experiences the world. Thinking about the ways we interface with our brain right now, we’re kind of already cyborgs. We strap into a car and we’ve augmented ourselves with the superpower of speed and an armoured shell. You can think about our phone and the internet as effectively unlimited memory, allowing us to access all of the world's information in close to an instant.</p>
<h2 id=is-ar-really-just-about-advertising-and-vr-the-patform-that-could-be-far-more-meaningful>“Is AR really just about advertising, and VR the platform that could be far more meaningful?”</h2><p>That might be where it is perceived to be right now, but I don’t think it’s where we’ll end up with these technologies. Where we’re headed with AR is something much more profound than current examples. I’m aligned with Apple on the belief that AR is going to be the next personal computing paradigm. In a sense, it will probably be the successor to the smartphone.</p>
<p>If you stop thinking about AR as holding up your phone in front of a magazine to reveal extra content, or holding it up on the street to see a Pokemon you need to catch, and instead start thinking about what kind of information could be dynamically displayed in your environment via a pair of spectacles, a contact lens, or indeed a neural interface, then the opportunities for personal benefit and efficiency are immense. Everything from real time directions, to ratings about a product you're looking at, and even the capacity of arriving buses — plus so much more. It will likely become our next 'default device' for general access to information. Whatever we need to know at any moment, it can be presented within our field of vision.</p>
<p>On the topic of VR, I believe it's the ultimate empathy machine. The ability to educate people about other times, other places, and indeed other people, will begin to reduce conflicts caused by the pathological boundaries we create, such as those of religion and politics... VR will ultimately be an aid in reducing racism, sexism, and much prejudice. To be able to experience someone’s life through their eyes, to understand what it means to be poverty-stricken, or fleeing war, as if you were there... Yes, right now it’s low resolution, but soon enough it’ll be high resolution and we’ll be plugging in more of our senses. Imagine a student being able to experience the Roman Empire, to have it rebuilt and rendered, and then experienced as if they were there, instead of just reading about it in text.</p>
<p>VR and AR will serve generally different purposes. One will be about utility and efficiency, the other will be about experience and escapism. Beyond that, it's awe-inspiring to think about VR as the final platform. By that, I mean we could actually simulate anything inside of VR, including AR. Indeed: we could put on AR spectacles inside of our VR experience.</p>
<h2 id=what-is-the-social-impact-of-this-kind-of-experience>“What is the social impact of this kind of experience?”</h2><p>There’s a natural tendency to resist and reject progress based on your own nostalgia, experiences, and behaviours. In reality, millennials aren't victims of screens — they're the most social generation ever. Technology is enabling them to socialise more than anyone ever has before, but they’re socialising in ways very different from what older generations are accustomed to. This is increasingly true for many, many aspects of how we live our lives.</p>
<p>The important thing to remember is that we are continually designing our future. These changes are not happening to us against our will — it’s us who decide what we do and how we do it. We create these technologies, and collectively we accept them or reject them. If the value that a technology presents us is meaningful enough, then we make tradeoffs for it. We give up some privacy to be on Facebook because of the benefit, as an example. This will continue: tradeoffs for benefit versus potential detriment. As long as we are net-gaining on benefit and leaving comparative detriment in the dust, we are always going to be in an overall better place.</p>
    </item>

    <item>
      <title>A blockchain explanation your parents could understand</title>
      <link>https://skella.com.au/writing/a-blockchain-explanation-your-parents-could-understand/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/a-blockchain-explanation-your-parents-could-understand/</guid>
      <pubDate>Sat, 03 Jun 2017 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Blockchain</category>
      <description>It&#x27;s happening in an increasingly frequent manner: &quot;Jamie, explain this blockchain stuff to me. I&#x27;ve read a bunch of articles and I&#x27;m no wiser.&quot; The problem with most blockchain explainers is that they provide more…</description>
      <content:encoded><![CDATA[<p>It's happening in an increasingly frequent manner: "Jamie, explain this blockchain stuff to me. I've read a bunch of articles and I'm no wiser." The problem with most blockchain explainers is that they provide more detail than what matters to most people, using language that is foreign to most people, which winds up leaving people more confused than when they started. Instead, without worrying about being a technically perfect description, here's an explanation of blockchain your parents could understand...</p>
<h2 id=first-lets-reframe-blockchain>First, let's reframe "blockchain"</h2><p>Starting with terminology, let's ditch unfamiliar words and talk plainly - I promise you won't read about 'nodes' or 'hashing' in this article. Jumping right in, a blockchain is a distributed database, otherwise known as a distributed ledger. To make things really simple and relatable, let's call that ledger a record book instead. Furthermore, let's talk in terms of it being shared instead of distributed. For even greater context, think about a 'block' as a line item in that shared record book.</p>
<p>So, for the purposes of this article, we're going to call blockchain a shared record book. Each addition to this record book is a new line item.</p>
<h2 id=a-shared-record-book-this-doesnt-sound-very-complex>"A shared record book? This doesn't sound very complex."</h2><p>Technology is only as complex as salesmen (and conmen) want it to sound. It's all created by people no smarter than you or I, and it's all quite simple when you break it down. That said, the modern version of this shared record book is indeed a technological triumph, one that is looking to shape our future in profound ways.</p>
<p>To be clear, this isn't just one record book stored in a central location that is shared by many. There are thousands of copies of this record book, stored on computers all around the world, both home computers and business servers - hence the term "decentralised". This record book can be used to record many kinds of things, however I'll use sending and receiving money as the primary example, as it's the most common one right now.</p>
<p>When John wants to send money to Sue, a new line item is created detailing that transaction. This line item then gets sent off to hundreds of other computers who have a copy of the record. Those computers confirm that this transaction is authorised, and ultimately they agree (or disagree) that everything about the transaction is legitimate before giving that line item a tick of approval. It has to match up perfectly on every copy of the record.</p>
<p>It's as if John and Sue had a few hundred mates stand around them and watch John hand Sue the money in question, with all of them agreeing that he really did hand her the money, as well as confirming other aspects of the transaction, such as it being the right amount.</p>
<h2 id=how-is-this-different-to-a-bank>"How is this different to a bank?"</h2><p>The genius of this shared record book is that it requires no bank, no centrally owned company, and you don't have to place your trust in any financial institution... there doesn't need to be any middleman of any kind.</p>
<p>To elaborate, this shared record book is not owned by any one individual or organisation. It's owned by everyone who has a copy - but that doesn't mean any one person who has a copy has control (more on this soon). Additionally, this record book is what we call "immutable", or in layman's terms, unchangeable. Every line entry made will exist in perpetuity, for as long as the internet exists. If Sue wanted to refund John's money, this would be a new line item sending the money back - not the crossing out of the original transaction.</p>
<p>Because of those technology design decisions, fudging line items in this shared record book is impossible. If someone who has one or more copies of the record book on their computers was to try and dishonestly change it, those changes would be rejected by the many computers used in the verification process - things wouldn't match up. "Um, John, that's Monopoly money."</p>
<h2 id=but-where-is-the-money-stored-if-there-is-no-bank>"But where is the money stored if there is no bank?"</h2><p>Here's one of the more challenging parts of this to grasp: when we're talking about digital currency such as Bitcoin, there's no repository of coins - that line item in the record book is the money. Pretend for a moment that the first entry in the book was by someone named Jessie - the founder of this new digital currency - who writes "1 million coins now exist". Jessie then hands them out to lots and lots of people, creating a new line item for each transaction. Jessie sent 500 to Bill, 1000 to Sue (Jessie likes Sue more, obviously), and so-on.</p>
<p>To receive those coins, Bill and Sue would have provided a wallet address to Jessie, which is the equivalent of the account details you provide to receive a direct deposit with your bank. Bill and Sue each have a very long, very secret code which gives them ownership of the line items that relate to their wallet. In this way, only they can create new line items with the coins that have been sent to them. Once Bill has created a new line item that says he has put 50 coins in Sue's wallet, he can no longer control where those coins go - only Sue can. This is how millions of people can have a copy of the record, without being able to add new line items relating to any of the other 1 million coins that are documented in this shared record book.</p>
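<p>For readers comfortable with a little code, the ownership rules above can be sketched as a toy. This is a deliberately simplified illustration only - no cryptography, no network of verifying computers - and everything beyond the article's characters (the class name, method names, and the plain secret strings standing in for real wallet keys) is invented for the example:</p>

```python
# A toy version of the shared record book. Real systems use digital
# signatures and thousands of verifying computers; here a plain secret
# string stands in for a wallet's key, purely for illustration.

class RecordBook:
    def __init__(self):
        self.lines = []    # the append-only record book: one dict per line item
        self.secrets = {}  # wallet address -> that wallet's secret code

    def register_wallet(self, address, secret):
        self.secrets[address] = secret

    def balance(self, address):
        # A wallet's "balance" is nothing more than the sum of its line items.
        received = sum(l["amount"] for l in self.lines if l["to"] == address)
        spent = sum(l["amount"] for l in self.lines if l["from"] == address)
        return received - spent

    def add_line(self, sender, receiver, amount, secret):
        # Only the holder of the sender's secret can spend from that wallet,
        # and only coins the record book says the sender actually has.
        if self.secrets.get(sender) != secret:
            return False  # "Um, John, that's Monopoly money."
        if self.balance(sender) < amount:
            return False
        self.lines.append({"from": sender, "to": receiver, "amount": amount})
        return True

book = RecordBook()
book.register_wallet("jessie", "jessie-secret")
book.register_wallet("sue", "sue-secret")

# The founding entry: 1 million coins now exist, owned by Jessie.
book.lines.append({"from": None, "to": "jessie", "amount": 1_000_000})

book.add_line("jessie", "sue", 1000, "jessie-secret")  # accepted
book.add_line("jessie", "sue", 1000, "wrong-secret")   # rejected: wrong secret
```

<p>Notice that the refund behaviour described earlier falls out naturally: Sue returning coins would be a brand new line item back to Jessie, never the deletion or editing of an old one.</p>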
<h2 id=beyond-banking>Beyond banking</h2><p>At this point, you might be thinking that this doesn't sound like it's going to change the world at all. It's just a way to verify the ownership of something digital, even if there are identical copies... right? If you think about that sentence for long enough, it will already begin to dawn on you how big of a deal that is. Up until now, a copy of something digital was indistinguishable from another. If an mp3 could be used as currency, there would be no way to tell whose copy of "Madonna - Like a Prayer.mp3" was the real one that I should exchange for goods and services, and which was a copy. Suffice it to say that prior to this technology, a truly digital coin was not possible - someone could just copy a coin a million times and be a millionaire.</p>
<p>Beyond disrupting the world of finance, the most mind-bending, earth-rocking uses of this technology actually have nothing to do with money specifically. With those fundamental workings explained, I'm now going to provide a few real-world examples that will give you an idea of just how this technology really could change the world. These examples are powered by shared record books, which bind agreements using 'smart contracts'. I won't go into detail about smart contracts here, but they enable the setting of parameters to determine when, why, and how new line items in the record book are created, generally based on each person upholding their end of the deal.</p>
<p>The future of democracy? Over at Horizon State, we’re using this same shared record book technology for voting. In effect, we've replaced Bitcoins (which represent currency) with tokens of our own, which represent votes. Instead of spending a Bitcoin, you're lodging a vote. The same benefits we see with digital coins apply to these votes: their authenticity and legitimacy are validated by many people's computers, and the record book of votes can never be tampered with. It exists to be recounted with the same result, forever.</p>
<p>The future of music? What if, when you opened an app a little like Spotify, instead of Spotify charging you a subscription and paying royalties to artists, you paid the artist directly? As an example, once you're 35 seconds into a song, a cent or two is paid directly from your wallet to the artist's. The record of what you've listened to and what you've paid is verified by many other computers, and the record is set in stone. Ujo Music and VOISE are working on ideas like this right now.</p>
<p>The future of file storage? Rather than store your files on Dropbox or OneDrive cloud servers, what if your files could be split up into tiny chunks and stored on thousands or millions of people's computers around the world? The record of what parts of files you own and where they are cannot be changed - only you have the key to view the pieces as a whole, and no organisation owns your data. This is the sort of thing that Storj and Sia.Tech are working on as I write this.</p>
<p>The future of energy? Imagine a country filled with Tesla Powerwall-equipped houses. Instead of being "off grid", they're very much on it - but they're not paying an energy provider for their kWh. In fact, in this future there's no need for traditional energy providers at all anymore. Instead, houses automatically generate, store, and trade electricity between themselves based on which neighbours need extra, and which have lots of excess in their batteries. Thanks to blockchain, this is no longer science fiction - it's being worked on everywhere from Australia to New York City.</p>
<h2 id=im-in-sign-me-up-to-the-future>"I'm in - sign me up to the future!"</h2><p>I'm afraid you're too early - this is cutting edge stuff that's in its infancy. A large majority of announced blockchain projects are not yet released, while the ones that have been feel as premature and experimental as many of the world's first websites did. I do foresee that blockchain-based technologies will underpin a good chunk of the Internet you use in the next 10-15 years, but like today's Internet, you shouldn't and won't have to understand how a blockchain-enabled internet works - it just will. Despite that, now you've got some insight into what will be happening behind the curtain when this next kind of internet creeps into every aspect of our lives, dismantling centralised businesses as they exist today along the way.</p>
<p>I'm sure you still have a lot of questions. As alluded to in my opening paragraph, this isn't intended to cover off every detail, be technically precise, nor be particularly comprehensive. However, armed with this kind of explanation, working towards an even stronger understanding of blockchain should be far easier than before.</p>
<p>That's it! Go forth, and tell your parents about blockchain.</p>]]></content:encoded>
    </item>

    <item>
      <title>The future of work, and our ultimate end</title>
      <link>https://skella.com.au/writing/the-future-of-work-and-our-ultimate-end/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/the-future-of-work-and-our-ultimate-end/</guid>
      <pubDate>Thu, 20 Apr 2017 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Work and strategy</category>
      <description>I like to think about the future. A lot. Both our immediate future so I can make better business decisions, and our distant one so I can make better decisions about what business to be involved in. An incredibly hot…</description>
<content:encoded><![CDATA[<p>I like to think about the future. A lot. Both our immediate future so I can make better business decisions, and our distant one so I can make better decisions about what business to be involved in. An incredibly hot topic right now is the subject of automation, AI, robotics, and what that means for us humans and the future of work. Very recently, I made some comments on social media about this, which generated some important conversations - ones worth sharing. Below are just some of the questions, and my subsequent responses.</p>
<p>As a qualifying statement, I don't expect to be right - futurists should be considering options for our future without being wedded to absolutes, as absolutes simply won't hold. I also don't expect you to agree with my view on the theme of the possible future described. However, that possible future - as well as the alternatives - is one that everyone needs to start thinking about right now, as our accelerating pace of change could easily present existential risks for humankind.</p>
<p>Full disclosure: I'm an optimist. Not unrealistically so - I'm certainly not Panglossian - but in my life I've seen no meaningful progress derived from pessimism. Some of the concepts described may seem far-fetched, but it's important to remember that culture and our values shape themselves around technological progress and related benefits - not the other way around. Who reading this knows someone who said "I would never ride in a stranger's car"? Because of this, it's even more important that we steer technological progress rather than just be passengers of it. Of course, to do this, we need many activists - people willing to fight for their idea of a better future. As somewhat of a call to arms, I ask that if you're worried about the future, you find a way to play a part in designing the one you want - it won't just unfold.</p>
<p>Brace yourselves, this spirals quite fast...</p>
<h2 id=will-the-next-jobs-revolution-be-different-to-past-ones-weve-been-here-before-the-printing-press-the-steam-engine-etc>"Will the next jobs revolution be different to past ones? We've been here before: the printing press, the steam engine, etc."</h2><p>What is likely to remain the same: tech will destroy and create many jobs.</p>
<p>What is likely to be different in future: tech will destroy many basic jobs (e.g. clerical), but instead create fewer complex ones (e.g. engineering) to replace only some of them. Tech will increasingly do this faster and faster, meaning less time to re-skill for new complex jobs than the time once permitted to re-skill for new basic jobs.</p>
<p>One thing is almost certain: if you're expecting to get a degree and do that job for life, you're mistaken.</p>
<h2 id=so-a-society-and-economy-that-increasingly-does-away-with-the-need-for-people-an-upper-class-of-tech-owners-producing-product-for-a-population-who-cannot-afford-to-purchase-it>"So, a society and economy that increasingly does away with the need for people? An upper class of tech owners producing product for a population who cannot afford to purchase it?"</h2><p>Where we could end up is a world of automated abundance, in which we've resigned our role as economic actors. Transport, food production, cleaning... everything, always, without a stress in the world. AI and robots won't need owners or operators at that point - they would self-maintain, and any iteration beyond their own self-learning, optimisation and adaptation would be subject to open source collaborative iteration by the communities they serve.</p>
<p>The question in my mind is: how rocky does it get before we make it there? Without measures such as a universal basic income, and ones we haven't even thought of yet, the time between now and then looks nearly certain to descend into chaos.</p>
<h2 id=how-do-we-reframe-the-sense-of-purpose-the-average-person-needs-to-get-fulfilment-out-of-life-in-the-coming-years>"How do we reframe the sense of purpose the average person needs to get fulfilment out of life in the coming years?"</h2><p>When you break down what work is, fulfilment doesn't truly come from "work". It comes from mechanisms that make up work: creation, collaboration, sharing, socialising, playing, achievement... We can maintain all of these mechanisms without work as we know it today. In fact, that automated world I speak of should provide the option to do these with more freedom than now.</p>
<h2 id=what-does-the-average-person-do-to-occupy-themselves-day-to-day-you-are-going-to-end-up-with-a-lot-of-idle-hands-doing-the-devils-work>"What does the average person do to occupy themselves day to day? You are going to end up with a lot of idle hands doing the devils work."</h2><p>Depending on which technologists, futurists, and philosophers you listen to, much of the population may well be engaged in a return to uniquely human pursuits, such as art and indeed philosophy - deeply engrossed in the process of creating and thinking in ways only humans can. Even if we develop general artificial intelligence, with consciousness, it's almost unfathomable to think it will be the same kind as ours. Many others might spend much of their time being entertained, spending time with friends and family, or in simulations.</p>
<p>I would argue that idle hands do not lead to the devil's work. Idle hands lead to creation, exploration, philosophy. History has demonstrated that with added convenience and efficiency gains, which deliver more free time to us, the species has accelerated even faster. A simple example of this is our transition from hunting and gathering to practices like agriculture, freeing up much of the time once used to hunt and gather.</p>
<h2 id=i-just-feel-like-its-a-big-over-estimation-of-the-amount-of-people-capable-of-those-pursuits-so-we-put-them-all-into-simulations-simulations-of-what>"I just feel like its a big over estimation of the amount of people capable of those pursuits. So we put them all into simulations? Simulations of what?"</h2><p>The people not capable or willing to undertake those kinds of pursuits will do the same thing they love doing right now. They'll engage in what they find fulfilling, which is things such as entertainment and socialising. They'll put themselves in simulations much like the ones people already spend time in: Xbox and religion. When you break it down, these are points based systems, providing goals, with next levels to get to. Some of these systems and points are superficial - people might argue the modern corporate ladder fits that description - and some are obviously profound. People partake in these simulations for reasons including escapism (from reality and work, ironically) and competition. I think people will continue to do so. People find enormous meaning in their current simulations. They find purpose and pleasure.</p>
<h2 id=entertainment-and-socialising-is-not-enough-if-you-have-no-work-to-compare-it-to-then-it-ceases-to-be-something-of-value-the-novelty-will-wear-off-very-quickly>"Entertainment and socialising is not enough. If you have no 'work' to compare it to then it ceases to be something of value - the novelty will wear off very quickly."</h2><p>There is "work" in entertainment and socialising (I'm referring to the aforementioned mechanisms that make work what it is): status pursuits, the act of creation, and so on. Think Minecraft. Think Xbox Gamerscore. Achievement and reward are key drivers for why people undertake most things, from conventional work to watching a feel good movie for the subsequent emotional delight. Achievement and reward are found in many places, and this will increasingly be the case if our future simulations are believable realities...</p>
<h2 id=minecraft-as-an-example-is-that-its-meaning-and-satisfaction-is-diluted-by-not-being-real-there-is-a-lot-of-satisfaction-in-creating-and-playing-in-that-world-but-at-the-end-of-the-day-it-isnt-real>"Minecraft as an example is that its meaning and satisfaction is diluted by not being real. There is a lot of satisfaction in creating and playing in that world. But at the end of the day it isn't real."</h2><p>People dedicate their lives to "simulations" - World of Warcraft and esports come to mind. Lots of people would love to do this forever and never deal with "work". Most people aren't technologists or artists. Most people in the west are actually employed as things such as drivers, or retail and hospitality staff... Many people don't enjoy the things they do for work, nor find fulfilment in them.</p>
<p>Those comparisons aside, future simulations won't be unreal - they'll eventually be real enough for us to not know any different, if we want. If you read up on Nick Bostrom's simulation argument, he makes good arguments for why we might already be in one. It's hard to take the concept seriously, but it's hard to deny his logic.</p>
<h2 id=i-guess-then-the-question-is-what-is-the-point>"I guess then the question is - what is the point?"</h2><p>There is no point, except the ones we make. Ultimate nihilism - being the acceptance that none of this really matters (likely to end in a big freeze, big rip, or big crunch) - is a great tool for evaluating what is really important to you personally, pursuing that, and helping others pursue what is important to them if you agree. That has been my personal experience, anyway.</p>
<p>Everyone needs to start thinking about these future possibilities, because how we raise the next generation - and help shape levels of anticipation for these kinds of changes - is going to greatly affect the outcome of the entire species. This includes an important cultural shift we must strive toward: from seeing dignity in "work" to seeing dignity in "creation" more broadly.</p>
    </item>

    <item>
      <title>More sleep + less work = better business results</title>
      <link>https://skella.com.au/writing/more-sleep-less-work-better-business-results/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/more-sleep-less-work-better-business-results/</guid>
      <pubDate>Tue, 18 Apr 2017 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Work and strategy</category>
      <description>All too often, workplaces have a toxic culture that expects less sleep + more work = better business results. The opposite equation is one I talk about often – from conference stages to company board rooms – which may…</description>
      <content:encoded><![CDATA[<p>All too often, workplaces have a toxic culture that expects less sleep + more work = better business results. The opposite equation is one I talk about often – from conference stages to company board rooms – which may seem counter-intuitive at first. More than talk, I built my last startup around the premise that team members shouldn't set an alarm, and should work whenever they want.</p>
<p>You can read more about those principles in the handbook I created for that business, but this article is not about that business. This article is designed to highlight important examples, research, and statistics which reinforce reasons to adopt similar ways of working for greater degrees of success.</p>
<p>The leadership philosophy of having people work when at their best isn't just one that keeps people happy and healthy. It's an economic weapon; it's a competitive advantage in the race to attract the best talent and deliver better outcomes.</p>
<h2 id=more-sleep>More sleep</h2><p>“Sleep is the single most effective thing you can do to reset your brain and body,” says Matthew Walker, a professor of neuroscience and psychology at the University of California, Berkeley. From the same article, the author notes: "scientists are learning that shortchanging sleep can compromise nearly every major body system, from the brain to the heart to the immune system, making our inability – or unwillingness – to sleep enough one of the unhealthiest things we can do."</p>
<p>Seemingly ignored by big business and early stage startups alike, study upon study has shown that poor sleep impairs learning and memory, while contributing to the depression of mind and immune system.</p>
<p>Like with most rules, some exceptions exist. There are people who can commit to surprisingly short periods of sleep without a large measurable detriment. That said, you are almost certainly not one of them - even if you think you are. Of every 100 people who believe they are part of a "sleepless elite" – someone who can get less than 6 hours of sleep each night and perform at their best – 97 to 99 are wrong. Instead, they are involved in the practice of self-harm. Their health, and their full potential, is at stake.</p>
<p>Thankfully, anecdotal understanding and objective hard science aren't being ignored by everyone. Aetna, a US-based healthcare company, pays its workers up to $500 a year if they sleep at least 7 hours a night for 20 days running. Amazon's CEO, Jeff Bezos, stands by his claim that getting 8 hours of sleep each night is good for Amazon shareholders. Dave Asprey, the creator of Bulletproof Coffee, reiterates this sentiment by saying that starting work at 10 a.m. doesn’t make you a bad person. He cites researcher Michael Breus' chronotype categorisations, which describe the largely genetically-driven factors that make some of us early birds, some of us night owls, and others somewhere in the middle.</p>
<h2 id=less-work>Less work</h2><p>Whether or not you take the time and make the effort to get the appropriate amount of rest, this paints only part of your 'productivity picture'. It's becoming increasingly understood that we've only got about four hours of our best work in us each day – it's just that your "best" varies greatly, depending on the quality of your sleep. After about four hours, the quality of your ideas and the time it takes to execute them slide into comparative oblivion.</p>
<p>Anders Ericsson, professor of psychology at Florida State University and one of the world’s leading researchers into the psychological nature of human performance, is no stranger to this topic. Ericsson has carried out many experiments demonstrating that people can only commit themselves to four or five hours of concentrated work at a time before their productivity levels are significantly compromised.</p>
<p>While researching his recent book, "Peak: How All of Us Can Achieve Extraordinary Things", Ericsson studied how Nobel Prize-winning authors managed their time. He found that they spent just four hours a day writing, on average. The 'productivity cliff' is a steep one, and many hours are wasted outside of our 'zone'. This pattern of focused work in short bursts is also reflected in the schedules of successful musicians and athletes.</p>
<p>With big corporates being beacons of stability, and startups being champions of innovative operations, you would think both categories of businesses would strive to be the first to make their workplace environments more sustainable and more desirable. Instead, many continue to venerate overwork, hopelessly. I specifically say hopelessly because managers can't even tell the difference between staff really working 80 hours and those pretending to.</p>
<p>Overwork is not just a recipe for employee churn and poor quality outputs, however. If it wasn't already clear in the consequences of poor sleep, overwork is a recipe for the erosion of positive culture, and even an increase in sick days taken due to its effect on health. A Stanford University meta-analysis of 228 studies showed that high job demands raise the odds of physician-diagnosed illness by 35%, and long working hours increase mortality by nearly 20%.</p>
<p>Thankfully, company leaders have started to pay attention and begin experimenting with the reduction of office hours. Employees of Basecamp (formerly 37signals) work four-day, 32-hour weeks for 6 months of the year. Jason Fried, the company's founder and author of "Rework" (one of the most important business books I've ever read) and "Remote", has said that when employees have a compressed work week, they are better at prioritising, and they have reported that the quality of their work improves.</p>
<p>You might ask, "if we're taking all this extra time for sleep and we're only getting 4 hours of efficient, high quality work done, how will we get the entirety of our work done?" Don't fret, because the Swedish aren't. They've just lowered their work day to 6 hours and are seeing impressive productivity gains because of it. People are getting the same amount of work done, although importantly, they're happier doing it than before.</p>
<h2 id=better-business-results>Better business results</h2><p>The RAND Corporation recently published a study that calculated the business loss of poor sleep. In the United States alone, it was $411 billion — a loss of 2.28% of gross domestic product. Regarding work itself, another recent study found that employee burnout is responsible for up to 50% of all attrition.</p>
<p>Ultimately, happier teams do better work. About 12% better, in fact. Sleep is a profoundly important ingredient for happiness, mental well-being, and knock-on successes. It's a prerequisite for improving quality and productive capacity, and a necessity for the kind of design thinking that can change paradigms - even when working fewer hours.</p>
<p>To succeed into the future, companies will not merely need the biggest teams or teams willing to work the longest hours. The companies that thrive will be the ones who embrace these cultural shifts and their benefits. These companies will be the ones that do the best job of keeping people happy and create environments where people can do their best work, for the best company gains.</p>
    </item>

    <item>
      <title>Progress, for whom? An open letter to innovation leaders</title>
      <link>https://skella.com.au/writing/progress-for-whom-an-open-letter-to-innovation-leaders/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/progress-for-whom-an-open-letter-to-innovation-leaders/</guid>
      <pubDate>Mon, 14 Nov 2016 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Essays</category>
      <description>If you’re involved in technology and are reading this article on Medium, it’s likely that you sit in a generally liberal, progressive, moderate, diverse, optimistic, affluent metropolis — the likes of Melbourne, Silicon…</description>
      <content:encoded><![CDATA[<p>If you’re involved in technology and are reading this article on Medium, it’s likely that you sit in a generally liberal, progressive, moderate, diverse, optimistic, affluent metropolis — the likes of Melbourne, Silicon Valley, London, and Berlin — wondering what the effect of Trump’s administration will be on technological progress.</p>
<p>Instead of looking at that impact and how to mitigate against any possible future wrongdoings of a Trump administration, we need to more importantly begin reflecting on the conditions leading to this election outcome, which we have helped cultivate.</p>
<p>It’s not a popular opinion amongst my fellow left, but sexism and racism didn’t win this election for Trump, even if the unfortunate truth is that this kind of bigotry is entrenched in the psyche of many. Claiming so is counterproductive to solutions thinking. In reality, this latest election was won by suburban outliers, country towns, and a broader American middle class who were targeted with campaign messages designed to inflate economic anxieties and related resentments.</p>
<p>This result should be a wake up call to us all that there is an incredible disconnect between places like Silicon Valley, and many capital cities — all up in arms about the result—who had forgotten to pay attention to the rest of America. Robotics, automation, offshoring, globalisation and other factors are eroding American jobs, and the value of those jobs… this was an inevitable result, which until now has not been apparent.</p>
<p>That’s not to say robotics, automation, offshoring and globalisation should halt. Quite the contrary. There’s a future ahead of almost unimaginable prosperity for all humans, where we resign our role as economic agents and live peacefully on this planet — and others — finding dignity in creativity and philosophy, instead of “work” as we know it today. But between now and then, it has become abundantly clear that we need to think harder and longer about how to combat the ill effects of our ever-quickening run toward that finish line, or the race might be called off early.</p>
<p>The above chart is a reflection of just how incredible globalisation has been as an equalising force for the planet. With an explosion of Asian economic growth, fewer people now live in poverty than ever before in history, despite our population being at a record high. Child mortality — being deaths under the age of 5 — is the lowest it has ever been, as is homicide and death by preventable illness.</p>
<p>No matter what our parents tell us, no, it was not better in their time. We live in the best time that there has ever been. Yet, even with longer lives globally and higher comparative incomes in developing nations, the developed world’s middle class isn’t seeing the same rise of opportunity. We’ve reached a tipping point, and they’re seeing opportunity diminish. The realities of our new economy mean almost nothing to those once employed by the automotive industry, to coal miners, or soon truck drivers as they’re replaced by autonomous vehicles.</p>
<p>It’s easy to say “you need to continually upskill and reskill, to pursue self learning.” But not everyone is an entrepreneur or academic in spirit or mindset, and they shouldn’t have to be. Many people are busy trying to raise their children and keep up with their mortgage — often far away from our dense city centres. Our pursuit of innovation is relentless, it’s human nature after all, but with every incredible transformation delivered to the world around them, the people that elected Trump only become more fearful; their concerns become exacerbated. They are not a part of that transformation, they are increasingly the remnants of what existed before it.</p>
<p>Shortly after the election result, a snide comment appears out of California: “Silicon Valley lives in a bubble, unlike the rural farmers who take time out of each day to consult a broad cross-section of Americans.” Amusing, somewhat, but completely missing the point. The rural farmers aren’t deciding the future of urban lives at all, let alone without careful consideration. They are not the catalyst for change — we are, the burden is on us.</p>
<p>As a technologist, an optimist, a progressive moderate-left, this is a hard pill to swallow. It’s hard to admit there’s something wrong here, that in fact it’s us who have been ignorant. I’m here to say that we have a problem. If we don’t start to have deeper and more meaningful conversations about what happens as rural jobs disappear and the world of yesteryear is dismantled — partly our own doing — there may be more Trumps around the world rising to power to represent those alienated by inward looking political and media establishments.</p>
<p>Politically, the left is also missing the point. In doing so, they’re enacting the same kind of ignorance and insular thinking that they so decry in the right, and in white middle America. We need to admit this result is partly our fault, but that we can be a part of the solution. A friend, Andy Aldahn, wrote it best: “When the left caricatures the uneducated, underpaid $30,000/yr man as a privileged deplorable racist if he fears his job sent offshore or if he fears Islamic terrorism coming to his city, then they played some part in pushing that voter away. If they call him a privileged irredeemable sexist if he doesn’t think exactly in their brand of university gender politics then they pushed that voter away.”</p>
<p>To reiterate, this was, in the end, an inevitable result. If not Trump in this election, someone like him in another. Over 60% of Trump voters believe, based on exit polls, that the next generation will be worse off than the current one. Undoubtedly, Trump’s fear campaign will have contributed to this, but for many of these people it might actually be true. Their vote was their opportunity to exercise a choice between more of the same, or to roll the dice on risk. As Nobel laureate Daniel Kahneman explains in his prospect theory, people are risk seeking when choosing between potentially unfavourable outcomes. In Fitzroy, Victoria, just like Silicon Valley, California, it’s easy to get lost in your near-Utopian bubble and forget the fears of so many others.</p>
<p>“Design the future, don’t just let it unfold”... “Disrupt your business before someone else does”… “You can’t stop progress”… “Be biased toward possibility”. These mantras have defined my career.</p>
<p>Indeed, progress cannot be stopped, but it’s time we had a look around us with the same amount of empathy we claim to apply to the outputs of our craft. While it can’t be stopped, progress can be directed. If, instead, we proceed by neglecting large portions of our populations without any real care to educate or support them through change, those people will keep looking for someone like Trump. In desperation, they are forced to ignore what he will ratify in terms of discrimination against minorities and disrespect of women, as they clutch at nostalgic ideas akin to making America great again.</p>
<p>If we don’t start asking the question “progress, for whom?” more often — then applying careful consideration to how we move forward in the most inclusive and considerate ways — when general AI and fully autonomous systems arrive, we’re going to be in a world of pain as this sentiment affects an ever larger portion of our population. No one will be excluded from this next leap forward, and it’s not hard to imagine a situation in which many of us are faced with the same challenges as those in shrinking American industries now.</p>
<p>We need to start thinking about how we as technologists are going to make the world better for everyone, how we’re going to appropriately inform and educate people, how we’re going to support them during the process. We’re going to need to find ways to help more people transition to the new world with us… otherwise the world might look very different to the one we’re working toward.</p>]]></content:encoded>
    </item>

    <item>
      <title>Resisting a mobile future is futile</title>
      <link>https://skella.com.au/writing/resisting-a-mobile-future-is-futile/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/resisting-a-mobile-future-is-futile/</guid>
      <pubDate>Thu, 28 Jan 2016 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Consumer technology</category>
      <description>It’s clear that the smartphone is far from finished unseating devices with bigger screens. PC shipments fell a record 10.6% in Q4 2015, despite continued predictions of a PC rebound from analysts. With it, the iPad and…</description>
      <content:encoded><![CDATA[<p>It’s clear that the smartphone is far from finished unseating devices with bigger screens. PC shipments fell a record 10.6% in Q4 2015, despite continued predictions of a PC rebound from analysts. With it, the iPad and other tablets continue to fall from grace, despite the entry of the iPad Pro and Google Pixel C.</p>
<p>With a dwindling interest in desktops, laptops and tablets, some people might ask how anyone is getting anything of value done. The people asking that question might be surprised to learn that 13% of Americans are now smartphone-only, up from 8% in 2013. These are residents who don’t even have a fixed broadband service at home. As reflected upon by Pew, “the reason they do not have broadband at home is because their smartphone lets them do all they need to do online.”</p>
<p>Looking at the trajectory of device sales data, we know that in 2016, the 5 billionth PC will be sold, but the 20 billionth mobile. By 2020, mobile will be at 10x the unit sales and 5x the install base of “PC”.</p>
<h2 id=evolve-or-die>Evolve or die</h2><p>The resulting equation for Mac and Windows developers is simple: ignore iOS and Android at your own peril. Those unwilling to build their software for the world’s biggest computing ecosystem are sealing their own fate. They’re asking to be disrupted in as little as 5 years. The computers that everyone has are the ones that should be targeted, not the ones declining in ownership and usage.</p>
<p>If you question the viability of productivity on mobile operating systems and mobile devices, you’d be right to do so. However, you’d only be right for a small and shrinking segment of our workforce. I still need my Mac to design on, which will remain the case until a Sketch equivalent reaches iOS. I’ll also need my mobile to be able to drive a larger screen, which is likely to eventuate; early indicators include Microsoft’s Continuum, Remix OS and Andromium.</p>
<p>Looking at the top 25 occupations in the USA should quickly help everyone understand that most people don’t have the same computing requirements as I do, or as many readers of this article do. For many people, their smartphone is the only computer they need.</p>
<p>For those relying on a PC for work, it’s important to reflect on the history of computing technology. If it has taught us anything, it’s that old work adapts to new tools. Critics of the PC format said it would be useful if it could run mainframe code. Critics of tablets said they would be useful if they ran x86 applications. Both perspectives proved entirely flawed. How you work, in the context of computing, will always evolve to take advantage of the largest-scale technology. That’s where the obvious opportunity lies for increased revenue and innovation. In modern terms, that scale was Windows, but it has just become Android and iOS.</p>
<h2 id=reframing-what-real-work-is>Reframing what “real work” is</h2><p>IT professionals, another workforce minority, can’t feasibly do their work outside of traditional hardware norms — yet. But real work comes in many shapes and forms. Some 3 million people are employed as authors and editors in the USA, which excludes those who blog, students writing essays, and more.</p>
<p>Writing at length on a smartphone may sound like someone’s worst nightmare, yet amongst millennials and those in developing nations it is becoming commonplace. It’s becoming common not just because of economic factors, but technological and cultural ones, too.</p>
<p>Anything that can be done on PC will also be done on mobile, provided it’s intuitive and efficient enough. In truth, something is only perceived as intuitive based on prior experience. The mouse didn’t originate as intuitive, but has become so. For Gen X designers, the idea of younger generations using a trackpad seems impractical, yet it too is happening. Our incredibly impressive 19-year-old UI designer at Contact Light does just that.</p>
<p>Abroad, Viranch Damani from Chennai in India, an engineering student and technology journalist, recently tweeted: “Paid 25k (USD $410) for laptop and use it like once in a week. Paid 6k (USD $100) for a smartphone and use 7 hours a day.” He writes using his Xiaomi Redmi 1S.</p>
<p>Today you can do everything from filing tax, to producing vlogs, to conducting video conferences on mobile. You can often do so with more pros than cons compared to desktop. New ways of getting things done end up being just as effective as the old, especially for natives of new tools.</p>
<h2 id=the-pc-is-dead-long-live-the-pc>The PC is dead. Long live the PC.</h2><p>All this said, there’s still room for the PC and tablet. They’re not dead, but neither is the mainframe. Some people say Elvis is dead, but he made $55 million in 2015 — it’s just that his relevance continues to diminish.</p>]]></content:encoded>
    </item>

    <item>
      <title>Your business will be disrupted by autonomous vehicles</title>
      <link>https://skella.com.au/writing/your-business-will-be-disrupted-by-autonomous-vehicles/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/your-business-will-be-disrupted-by-autonomous-vehicles/</guid>
      <pubDate>Wed, 29 Jul 2015 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Work and strategy</category>
      <description>Engaged in conversation with colleagues about the future of transport today, I got talking about the impact that autonomous vehicles will have on the advertising industry. If that seems like a strange leap to make, bear…</description>
      <content:encoded><![CDATA[<p>Engaged in conversation with colleagues about the future of transport today, I got talking about the impact that autonomous vehicles will have on the advertising industry. If that seems like a strange leap to make, bear with me. In fact, if you think that fleets of driverless cars only endanger the jobs of taxi drivers, you’ll be surprised to realise that your bottom lines might be in danger as well, whatever your business.</p>
<p>To explain who is in the firing line, I want to start by explaining three key benefits of autonomous fleets — all being profoundly important improvements for human safety and productivity — before describing how each of those benefits is going to completely reshape our economy.</p>
<h2 id=road-fatalities-will-be-all-but-stamped-out>Road fatalities will be all but stamped out</h2><p>Right now deaths on roads are high. Staggeringly high. In the US, the number of fatalities that occur on roads is in the vicinity of 40,000 every year. It’s right up there with influenza.</p>
<p>Seat belts, traction control, brake assist and many other automotive innovations have seen our roads get progressively safer. Yet, fundamentally, road fatalities are caused by humans. A report from the Eno Center for Transportation says that up to 90% of crashes are the result of human errors, rather than equipment or infrastructure failures.</p>
<p>Relinquishing control to computers that don’t speed, don’t get distracted and can see in 360 degrees at all times has already been demonstrated to be safer than driving yourself. Google’s driverless cars have been involved in over a dozen incidents in their 6 years and 1.5 million kilometres of operation, all of which were the fault of a human driver.</p>
<p>That same report from Eno expects that road incidents will be cut to 1% of current levels, while Volvo has even boldly claimed that after 2020 no one will die in one of their cars ever again, partly thanks to their plans for autonomous operation.</p>
<p>This outcome is obviously a very good thing, but what will this heightened level of road safety mean for the economy?</p>
<ul>
<li>Car insurance: worth $198 billion in the US, the industry will need to brace itself for a massive downscale as a consequence of lower car ownership and fewer incidents.</li>
<li>Medical practices: in the US alone, a reduction of road incidents at this scale will see emergency rooms attending to millions fewer patients every year, and hospitals will see many hundreds of thousands fewer people staying overnight. Then there’s the matter of reduced GP visits and fewer scripts for prescription drugs.</li>
<li>Health insurance: as car-related injuries plummet, so will health insurer revenue, with the cost of cover shifting significantly downward.</li>
<li>Crash repairers: panel beaters won’t have much to do when incidents on the roads have been all but eliminated, becoming one of many small business types consigned to obsolescence.</li>
<li>Law practices: personal-injury lawyers will see car-related cases evaporate, as incidents and subsequent injuries on the road peter out.</li>
<li>Government: road offences will dramatically plummet, resulting in reduced revenue generation from fines. Autonomous cars don’t speed, don’t drink-drive, and don’t park illegally.</li>
</ul>
<h2 id=road-transport-will-be-incredibly-cheap-and-accessible>Road transport will be incredibly cheap and accessible</h2><p>Owning a car is really expensive. The American Automobile Association estimates that all-inclusive, it costs an average of $10,000 annually to run a car on the road. Taxis are a far-too-expensive alternative. UberX is cheaper, but still couldn’t be used as a cost-effective total replacement today. Looking ahead, what if that UberX didn’t need to pay a driver, insurance costs became insignificant, and cars were always close by? Suddenly not owning a vehicle presents a far better financial position for most people, without 2015’s inconveniences of not having your own.</p>
<ul>
<li>Car manufacturers: as autonomous fleets rise, car ownership will fall. With fewer consumer customers, car manufacturers will live and die based on their agreements with fleet operators. Dr. Alexander Hars, MD of Inventivio GmbH, has forecast that only 20% of the US population will still own a car in 2030, and PricewaterhouseCoopers has made an arguably outlandish prediction that the number of cars on US roads will have dipped by an astounding 99% — from 245 million to just 2.4 million — when automobile autonomy has been fully realised.</li>
<li>Public transport: why walk to the tram stop when a car will pick you up and drop you off for a comparable expense, with just a minute’s notice? A new report from the International Transport Forum bluntly states that “for small and medium-sized cities, it is conceivable that a shared fleet of self-driving vehicles could completely obviate the need for traditional public transport.”</li>
<li>Hire cars: technically you’ll still be hiring a car amongst autonomous fleets, but traditional hire car companies will cease to exist, at least in their current form.</li>
</ul>
<h2 id=traffic-congestion-will-become-a-thing-of-the-past>Traffic congestion will become a thing of the past</h2><p>When the autonomous car you’re in knows exactly where all of the others are and what their status is, routes can quickly be optimised automatically. Pair that with the ability to accelerate and brake together — like carriages on a train — and you have an extremely efficient network of vehicles that don’t suffer from the “rubber banding” we see when groups of cars take off and slow down now.</p>
<p>Additionally, ride-sharing will finally become efficient and practical. Car pooling has always been a highly economical and environmentally friendly way of commuting, but soon autonomous cars will be able to respond to pick-up requests along dynamic routes, generated by rider demand, with extreme effectiveness.</p>
<ul>
<li>Oil: putting the drastic reduction in car numbers aside, even the autonomous cars remaining will consume less fuel by way of efficient routing and operation, including lowering wind resistance with perfect following distances. In reality they’ll use no fuel by 2030, as most cars will be electric, although they’ll similarly use less of their battery than a human counterpart would.</li>
<li>Aftermarket parts and service: there will be fewer individuals needing to maintain cars in the future. For those that do, autonomous cars will apply less mechanical and tyre wear than humans with every manoeuvre, prolonging periods between services.</li>
<li>Highway stop establishments: autonomous trucks don’t need to rest, while humans inside driverless cars can recline back and get some shuteye as they continue on their automated journey.</li>
<li>Construction: autonomous cars will be using existing roads far more effectively than human operated ones, ending the need for most future widening, expansion and new road projects.</li>
<li>Parking: somewhere to store your car at the end of a drive into town is big business, a $100 billion industry in the US alone. With far fewer cars and far fewer of those needing to remain parked in dense metropolitan settings, the car park as we know it today becomes obsolete.</li>
</ul>
<p>Beyond those knock-on effects, there are sure to be many more, including my initial remark about an industry seemingly far removed: advertising. As fewer people gather in transport hubs such as train stations and fewer people pay attention to roadside out-of-home adverts, even advertising will need to reposition itself to cater for this change in behaviour.</p>
<p>Amidst these changes are vast opportunities for money to be made, not just lost, which you will probably have begun to imagine while reading this article. However, who makes money and who loses money in the future described here will begin with the decisions that business leaders make today.</p>]]></content:encoded>
    </item>

    <item>
      <title>Mobile advertising is about to be fundamentally disrupted</title>
      <link>https://skella.com.au/writing/mobile-advertising-is-about-to-be-fundamentally-disrupted/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/mobile-advertising-is-about-to-be-fundamentally-disrupted/</guid>
      <pubDate>Fri, 26 Jun 2015 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Consumer technology</category>
      <description>When discussing emerging trends in mobile device use recently, I was asked what the future might look like and what changes might pose the greatest threats to existing business models. The industry that sprung…</description>
      <content:encoded><![CDATA[<p>When discussing emerging trends in mobile device use recently, I was asked what the future might look like and what changes might pose the greatest threats to existing business models. The industry that sprung immediately to mind was advertising, which is about to change forever, for three key reasons:</p>
<ol>
<li>A reduced visibility of traditional mobile ad placements</li>
<li>Less consumer reliance on web search, websites and apps for information</li>
<li>The rise of voice user interfaces paired with improved artificial intelligence</li>
</ol>
<p>That third point may sound somewhat far-fetched, but it’s already here and will become a pillar of personal computing within 10 years.</p>
<p>Let’s explore these forward thoughts in a little more detail…</p>
<h2 id=were-approaching-peak-ads-on-mobile>We’re approaching peak-ads on mobile</h2><p>There’s an enormous number of mobile banner ads interrupting our journeys and taking up valuable pieces of screen real estate right now. By 2017 there’ll be less getting in the way, even if nothing else changed in technology except for the following developments.</p>
<ul>
<li>Apple will allow third party developers to create mobile Safari extensions and let users block banner ads in Safari on iOS 9</li>
<li>Numerous EU mobile carriers are talking about blocking ads on their entire mobile networks</li>
<li>For the fourth time, courts have ruled Adblock Plus to be legal</li>
<li>Adblock Plus is working with businesses and universities to block ads on their networks to save bandwidth</li>
<li>Apple will no longer allow the use of “canopenURL” API to deploy targeted ads within apps</li>
<li>Adblock usage grew by nearly 70% between June 2013 and June 2014, now covering over a quarter of all US internet users polled</li>
<li>A Firefox-based Adblock browser for Android has just entered beta</li>
</ul>
<p>To be clear, however, a lot else is changing in technology. Even if that list of mobile-ad-averse initiatives didn’t exist, people would soon be exposed to far less banner advertising on mobile screens anyway…</p>
<h2 id=deep-linking-and-predictive-computing-threaten-search-as-we-know-it>Deep-linking and predictive computing threaten search as we know it</h2><p>Spotlight in iOS 9 (and OS X El Capitan) will begin to steer users away from apps and websites for information discovery, effectively declaring war on traditional search, inclusive of the paid results and banner ads within it.</p>
<p>That same Spotlight screen you currently use to search your phone for apps and messages will soon search much, much more. Spotlight will retrieve results for sports scores, stock prices and YouTube videos. It will show you nearby places to eat and drink, present news headlines related to your keywords, and even display information from the depths of other apps without opening them.</p>
<p>Avoiding traditional search engines, websites and even other apps isn’t an approach exclusive to Apple. Google Now is getting smarter, increasingly able to surface more of what you need directly through Android, without you having to find that information elsewhere. This includes the company’s predictive computing innovation, “Now on Tap”, which allows you to directly retrieve information about anything already on your screen, without any external discovery process. An example might be cinema times for a specific movie, if someone had just asked whether you want to see it.</p>
<p>Google Now On Tap contextually understands what you’re after, skipping the need to search and browse.</p>
<p>Technologies like these, ones that learn to anticipate what you want and what you might do next, are in fact just the beginning of far more meaningful and frictionless AI…</p>
<h2 id=voice-user-interfaces-and-ai-will-mean-less-looking-literally>Voice user interfaces and AI will mean less looking, literally</h2><p>Picture cleaning up a spill to notice you’re low on paper towel. Without grabbing a device or moving to one, you simply say “Alexa, reorder paper towel,” and it’s then magically delivered to you.</p>
<p>That’s precisely what Amazon’s Echo enables: the company’s voice-based home computing solution launches this month in the USA. Devices like it form an important part of our IoT (Internet of Things) future, often sporting interfaces beyond sight and touch, and Apple is likely not far behind with its next version of Apple TV.</p>
<p>Environmental computing aside, voice user interfaces with this same level of clever understanding are about to become commonplace on your smartphones and wearables, with improved versions of Siri and Cortana. I’ll happily admit that the only real utility and convenience I get out of my Apple Watch is the ability to ask questions, get directions and set geo-based reminders without needing to look at my smartphone — this use case is about to get more practical and more powerful.</p>
<p>Requests like “purchase a Bose SoundLink Mini at the best price but that can be delivered by 4pm” aren’t far off, thanks to the progress being made in contextual computing and natural language processing. This will critically shift eyeballs away from search, from web, from apps and more generally from personal computing screens. It will do so in a far more profound way than any other threat to mobile and web advertising.</p>
<h2 id=where-does-advertising-go-from-here>Where does advertising go from here?</h2><p>I’m not alone in these thoughts. Jack Matthews, director at APN Outdoor Media, thinks publishers have a right to be worried. Speaking to the Sydney Morning Herald about the effect of these changes, Matthews said that “most advertising, as a revenue model, is under threat for a variety of reasons.”</p>
<p>The creators of Viv, a new AI from the team behind Siri, are more than aware of their threat to the industry as it exists. If you’re asking for something and the best possible answer or most suited product is delivered, there’s no space for a sponsored message. Esquire recently interviewed Viv co-founder Dag Kittlaus and touched on the same subject; he stated boldly that “business models will change”. As succinctly suggested in response to Kittlaus, that may just turn out to be the biggest understatement of the Internet age.</p>
<p>All of this technological change won’t have to detrimentally impact the enormous forecasts for mobile advertising revenues over coming years. However, if mobile ad spend really is to far surpass a majority share of all digital ad expenditure, we’ll have to renew interest and effort in bolstering DOOH (digital out-of-home) advertising capabilities. In particular, intelligent communication between DOOH placements and devices of passersby. Finding ways to reach individuals directly, with truly personal, timely and geographical relevancy, is of higher priority than ever before.</p>
<p>With the acceptance that consumers will be exposed to far fewer display ads in their personal computing lives soon, comes the realisation of that required trajectory. Ultimately, the future of advertising lies with those capable of imagining new, sophisticated ways of engaging audiences that extend well beyond the ones imagined today.</p>]]></content:encoded>
    </item>

    <item>
      <title>Our virtual reality future starts with the Gear VR</title>
      <link>https://skella.com.au/writing/our-virtual-reality-future-starts-with-the-gear-vr/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/our-virtual-reality-future-starts-with-the-gear-vr/</guid>
      <pubDate>Tue, 06 Jan 2015 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Consumer technology</category>
      <description>Delivered via a US forwarding address, my Samsung Gear VR arrived in Melbourne yesterday. With great anticipation, I flashed my Telstra Note 4 with Virgin’s firmware and proceeded to get lost in unfamiliar worlds for…</description>
      <content:encoded><![CDATA[<p>Delivered via a US forwarding address, my Samsung Gear VR arrived in Melbourne yesterday. With great anticipation, I flashed my Telstra Note 4 with Virgin’s firmware and proceeded to get lost in unfamiliar worlds for the afternoon.</p>
<p>So, was it the virtual reality experience I envisioned would be one day possible, after first trying an Oculus Rift? Could it begin to profoundly shift philosophy, the economy and our society in ways we can’t even begin to dream of? No, not even close. But, I knew that already.</p>
<p>Much like the Oculus developer kits, the Gear VR is merely a glimpse into a technology that WIRED say will “change gaming, movies, TV, music, design, medicine, sex, sports, art, travel, social networking, education, and reality.” It’s a grandiose statement that I agree with, but for now I won’t use it for much more than a personal 400-inch travelling movie screen. At times I might even send friends on a jaw-dropping tour of the solar system without needing to sit them down next to my computer.</p>
<p>So, why my heightened anticipation for the Gear VR? Virtual reality hasn’t progressed far with this iteration, but finally, it’s being made available for the general public’s consumption. Despite the label of “Innovators Edition”, you don’t have to be an innovator to try it in retail stores or see it plastered across Melbourne billboards and bus stops.</p>
<p>That promotion of presence is a big deal for virtual reality, so as a designer and strategist I wanted to dive in and see what the Gear VR would mean for today’s market.</p>
<h2 id=its-expensive-and-undercooked>It’s expensive and undercooked</h2><p>The Gear VR isn’t actually any more affordable than previous Oculus offerings. It requires an investment of about $1300, once you’ve included the Samsung Note 4 smartphone. With the money spent on Gear VR, enthusiasts could build a gaming PC and buy a DK2 for far higher fidelity immersion. Pairing it with a racing wheel and leaning into apexes as you speed around a nearly photo-realistic Silverstone circuit just isn’t possible with the power of the Samsung Note 4, nor with the Gear VR’s lack of positional tracking.</p>
<p>However, a key point of difference using the Gear VR is that you can put yourself inside of a virtual world as far away from a traditional computer as you like. Sitting on a virtual beach with a real breeze at your back and real sun on your face is infinitely more believable than if you’re seated in a computer room.</p>
<p>For anyone expecting that the Gear VR’s 1440p screen would yield vast improvements, bettering the DK2’s 1080p, the resolution benefit is unfortunately undermined by other factors. A slightly reduced field of vision, a 15Hz lower refresh rate and a difference in optics are where it falls down. The result is more peripheral-vision blurriness than I had hoped for, and slightly more flicker.</p>
<h2 id=theres-not-much-to-do-with-it>There’s not much to do with it</h2><p>Beyond the barrier of price, if you don’t care for playing games or using it to consume media on your lonesome, there’s the question of what you would even do in virtual reality — even if you could step in from anywhere.</p>
<p>You should be excited to know that in the near future NextVR plans to offer virtual seats at live music and sporting events. Here in Australia, Realestate.com.au is building a way to let home buyers walk through houses of interest with it as well. Unfortunately there is nothing as compelling as any of that for the Gear VR right this minute.</p>
<h2 id=be-excited-anyway>Be excited anyway</h2><p>Samsung’s Gear VR won’t be the device that lands virtual reality in the homes of average consumers in high volumes, or a catalyst for reinventing industries. It is however a landmark device for virtual reality. Not because of how good or bad it actually is, but because for many people it’s the first virtual reality experience that is real.</p>
<p>This newfound exposure and availability is of unprecedented importance for virtual reality, along with those who hope to change the world with it. The future is up for grabs, by those who can imagine it.</p>]]></content:encoded>
    </item>

    <item>
      <title>The Bottom Line Value of Design</title>
      <link>https://skella.com.au/writing/the-bottom-line-value-of-design/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/the-bottom-line-value-of-design/</guid>
      <pubDate>Tue, 17 Jun 2014 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Design</category>
      <description>There has been an ever-brightening spotlight put on design and its business value over the past decade. That attention has been earmarked by the sweeping successes of companies who invest in it. I’m not talking about…</description>
      <content:encoded><![CDATA[<p>There has been an ever-brightening spotlight put on design and its business value over the past decade. That attention has been earmarked by the sweeping successes of companies who invest in it. I’m not talking about just brand and interface, but a broader definition of design. I’m talking about design as informed planning and deliberate decisions regarding how something looks, how it works and how it makes you feel.</p>
<p>Now more than ever there’s an electric bustle amidst the startup scene, where design is considered critical and often plays an instrumental role in driving success. But it’s not just startups that are reaping the rewards. Small and large, new and old, companies around the globe are waking up to the importance of user experience: becoming agile, moving from design as styling to design as strategy, and cultivating user-centred ways of operating. China’s wildly successful Alibaba is about to go public in the US and is expected to raise more than the Facebook share sale, by putting shareholders last in its customer-first value system.</p>
<p>The very best businesses are wielding the potential of design thinking to disrupt and differentiate with great success. They’re raising expectations for beauty and utility, empowering people and creating efficiencies in ways that were just recently science fiction.</p>
<p>You only have to look as far as the uptake that Xero’s accounting software is seeing, or the global impact Uber’s ride-sharing service is having, to garner some appreciation for the power of design both in form and function. Their efforts are a stark reminder that the need for design isn’t just to make things look pretty, or even improve usability. Design is delivering value through competitive advantage, increased productivity and ultimately profitability. An evangelist for that business reality, co-founder and Head of Design at Xero, Philip Fierlinger, says, “we’re not a software company. We’re a user experience company.”</p>
<h2 id=the-effect-of-design-on-company-performance-and-the-stock-market>The effect of design on company performance and the stock market</h2><p>As mentioned in my recent article “What UX isn’t”, a 2013 study revealed that design-led companies have outperformed the S&P 500 by an immense 228% over the past 10 years. That number was arrived at by tracking the value of the only 15 companies in the S&P that met strict criteria, including embedding design leadership at senior levels and utilising design as an innovation resource. The names of those 15 companies are largely unsurprising: Apple, Coca-Cola, Ford, Herman Miller, IBM, Intuit, Newell Rubbermaid, Nike, Procter & Gamble, Starbucks, Starwood, Steelcase, Target, Walt Disney and Whirlpool.</p>
<p>In another S&P comparative analysis, Watermark Consulting created two stock portfolios based on Forrester Research’s Customer Experience Index, composed of the Top 10 and Bottom 10 publicly traded companies ranked on it. From 2007 to 2012, a six-year period spanning recession and recovery, the Top 10 customer experience leaders outperformed the S&P with close to triple the returns, at a cumulative total of +43%. The Bottom 10 generated a negative cumulative total return of -33.9%.</p>
<p>Over in Britain, the UK Design Council told us in 2013 that design is linked to profit: for every £1 spent on design, businesses see a £4 increase in net operating profit. Back in 2004, the UK Design Council published another report, which found that design-aware companies analysed on the London Stock Exchange from 1994–2003 outperformed the FTSE 100 and FTSE All-Share indices by more than 200%.</p>
<p>Inspired by that 2004 report, the founders of experience design agency Teehan + Lax created the UX Fund in 2006. In the fund were 10 chosen companies with a history of innovation and demonstrated care in the design of their products, such as Google and Netflix. They invested $50,000 of their own money and saw a return of +39.3% in the first year alone. By 2011 the fund had grown a huge +101.8% before the holdings were sold.</p>
<p>I asked agency co-founder Geoff Teehan what the fund might look like now, in 2014. He told me it would be up approximately 205%, and even that would be a conservative estimate, as it doesn’t account for multiple stock splits.</p>
<h2 id=the-effect-of-user-research-on-design-outcomes>The effect of user research on design outcomes</h2><p>Braden Kowitz, a design partner at Google Ventures, recently published an article in The Wall Street Journal on becoming great at design by listening to your customers. In it he cited one of the Internet’s most frequently misappropriated Henry Ford quotes: “If I had asked people what they wanted, they would have said faster horses.” Kowitz argues that those questions must still be asked, and that if the answer really was “faster horses,” the insight to draw is that people feel getting around is too slow.</p>
<p>Understanding real problems and then designing the right solutions, instead of speculative ones, should make obvious sense to everyone. Yet common pitfalls persist: solutions designed by the people who build them without consideration of real users, solutions based on the anecdotal ideas of management, solutions designed to match competitors, and solutions designed around new technology for no reason other than “it’s now possible”. The late Steve Jobs echoed this sentiment: “you’ve got to start with the customer experience and work back toward the technology — not the other way around.”</p>
<p>O2 learned this the hard way: version 3 of the “My O2” app was designed from a technical rather than a user perspective, and was consequently received poorly by the British telco’s customers. When O2 later turned to a new “Customer-Centred Design” process that began with user needs research, subsequent versions went on to save the company millions of pounds through reduced support call volumes. Rogers, one of Canada’s largest telecommunications companies, started an intranet redevelopment project by first observing usage and quantifying how long it took staff to complete tasks. After addressing the inefficiencies it uncovered with a new design, employees were saving on average 22 minutes per week, enabling productivity gains worth in excess of $10 million per year. Then there’s Microsoft, which by testing Bing link colours settled on a very specific shade of blue that created an additional $80 million in annual revenue.</p>
<p>While it may seem that all of the research required for informed planning would slow delivery and increase cost, being user-centred from the very beginning of every design process in fact greatly reduces substantial rework, reveals revenue opportunities and lowers customer support volumes. Tom Gilb famously stated in his timeless reference text, “Principles of Software Engineering Management”, that the rule of thumb in many usability-aware organisations is that a $1 investment in usability returns $10 to $100. He writes that once a system has been released, a problem costs 100 times as much to fix as it would have beforehand.</p>
<p>There’s no longer any doubt about it: the evidence has well and truly mounted. The value of design has grown so bright that it’s now unavoidably blinding. For those who recognise its importance and act upon it, there’s the potential for transformative outcomes that amass advocacy, improve performance metrics of all kinds and propel bottom lines upward. For everyone else, this poignant remark from Jaguar Land Rover CEO Ralf Speth is worth contemplating: “if you think good design is expensive, you should look at the cost of bad design.”</p>]]></content:encoded>
    </item>

    <item>
      <title>What UX isn’t</title>
      <link>https://skella.com.au/writing/what-ux-isnt/</link>
      <guid isPermaLink="true">https://skella.com.au/writing/what-ux-isnt/</guid>
      <pubDate>Fri, 11 Apr 2014 00:00:00 +0930</pubDate>
      <dc:creator>Jamie Skella</dc:creator>
      <category>Design</category>
      <description>There are still many common misconceptions about what user experience design (UX, UXD) is among the companies that need and want to invest in it.</description>
      <content:encoded><![CDATA[<p>There are still many common misconceptions about what user experience design (UX, UXD) is among the companies that need and want to invest in it.</p>
<p>Highlighting the impact of good design, what we’ve seen over the past ten years is that design-driven businesses have outperformed the S&P 500 index of large publicly traded American companies by a massive 228%.</p>
<p>Characteristics of design-driven businesses are the embedding of design leadership at the highest organisational levels, and a top-down commitment to using design as a catalyst for innovation. However, the “design” in “design-driven” isn’t what most people think of when the term is mentioned.</p>
<h2 id=ux-isnt-just-ui>UX isn’t just UI</h2><p>UX design shouldn’t be confused with UI specifically: it isn’t merely about an interface or pretty pictures, but about the bigger picture. To paraphrase Steve Jobs: design isn’t just how it looks, it’s how it works.</p>
<p>To that point, user experience is indeed design, but it defines solutions far beyond visual and interface design. Making informed and intelligent design decisions means including user research and usability testing, as well as an understanding of behavioural psychology and human-computer interaction (HCI) principles.</p>
<h2 id=ux-isnt-just-usability>UX isn’t just usability</h2><p>Usability, and the testing that goes along with it, is certainly key to a successful UX solution. Yet being usable is only one of many qualities of a good interface — visual, voice, or otherwise — not the user experience in its entirety.</p>
<p>Beyond being simple to learn and efficient for task completion, good UX design requires pragmatic creativity, an understanding of colour theory, and interaction design capable of eliciting positive emotional responses. Being ‘usable’ alone doesn’t make an interface desirable, delightful or, most importantly, useful.</p>
<h2 id=ux-definitely-isnt-just-common-sense>UX definitely isn’t just common sense</h2><p>As ambitious and often tech-savvy professionals, our understanding of technology and how we use it differs greatly from that of a typical end user of any given product. Our personal expectations and anecdotal experiences are therefore largely meaningless. Common sense is not common, and UX decisions should rely on building consensus.</p>
<p>Good design requires a deep understanding of your target demographic, only attained through quality data, being a result of unbiased research and testing.</p>
<p>Designing for yourself is an easy trap to fall into. Even with good taste and a knowledge of best practice, doing so is a sure-fire way to get it wrong for your target demographic.</p>
<h2 id=so-what-is-ux>So, what is UX?</h2><p>UX is the consideration of the many aspects of a user’s interactions with a product or service. It’s a concern for the relationships between those interactions, which together ultimately define a user’s perception of a brand as a whole.</p>
<p>More than just a new word for common sense, interface design, or usability, UX is the combination of the disciplines and practices mentioned above, not to mention those surrounding information architecture, motion design and product strategy, to name just a few.</p>
<p>As for the UX title itself, above all else a good practitioner is able to acutely empathise with the audience they’re designing for.</p>
<p>Truth be told, my observed reality across many organisations over the past decade is that UX is described and utilised very differently depending on the people and processes involved. My personal precept for every project has long been that good UX is the result of understanding the customer, seizing technological opportunity, and pursuing simplicity.</p>]]></content:encoded>
    </item>

  </channel>
</rss>
