A position paper from GlassCase.org

Making Integrity Visible

Government decisions affect millions of Australians.
But if you cannot see how those decisions are made,
how do you know they were made properly?


GlassCase · glasscase.org · Civic-legal data lab
February 2026

The problem · Invisible architecture

The architecture of government decisions is invisible to the people those decisions affect.

Government agencies make thousands of decisions every year. FOI requests. Migration decisions. Licensing refusals. Disciplinary proceedings. Administrative reviews.

The law that governs these decisions exists. It is written down. In many cases, it works. But the architecture of these decisions — how they are structured, where discretion lives, where mandatory considerations are applied or missed — is invisible to almost everyone outside the system.

Rules live in Acts, policies, ministerial directions, and agency manuals. There is no shared map. No shared language. No way for a researcher, journalist, or affected person to see the full picture.

The result: decisions that fail review not because the decision-maker got the law wrong, but because a required consideration never appeared in the reasons. Delays that breach statutory timelines with no practical consequence. Redaction patterns that look chaotic from the outside but follow hidden internal logic.

This is not a problem of bad people. It is a problem of invisible architecture.

Legal scholar Lawrence Lessig argued that behaviour is shaped not just by law, but by architecture — the design of the systems people move through. A speed bump slows traffic the same way a speed limit does, but without a single word. Administrative systems work the same way. Their design determines who can participate, who gets heard, and who gives up.
Evidence & sources

Lessig, L. (2006). Code: Version 2.0. Basic Books. Lessig identifies four modalities of regulation — law, norms, markets, and architecture — and argues that architecture often regulates behaviour more effectively than written rules. lessig.org

The FOI Act 1982 (Cth) mandates consideration of specific factors (e.g., s 11, s 15, s 22). The Administrative Decisions (Judicial Review) Act 1977 (Cth) (ADJR Act) provides judicial review for procedural failures. Mandatory considerations derive from Minister for Aboriginal Affairs v Peko-Wallsend Ltd (1986) 162 CLR 24.

What it looks like · Transparency

Information is released. But most of the process stays invisible.

Think about what happens when an agency releases documents under FOI. The agency publishes numbers — how many requests it received, how many it processed on time, how many were refused.

But those numbers don't tell you the important things. They don't show why certain documents were redacted. They don't reveal whether the agency actually considered everything it was legally required to consider. They don't map the gap between how the process is supposed to work and how it actually works.

Reported: request counts, timeliness stats.
Not visible: decision reasoning, redaction logic, discretion patterns.

Researchers Archon Fung, Mary Graham, and David Weil studied this pattern across eight transparency policies in the United States. They found that transparency works like a chain: information is disclosed, people understand it, they change their behaviour, and then disclosers respond. When any link in that chain breaks, transparency fails.

A recurring failure point? The information is released — but in a form that most people cannot understand or act on. The chain breaks at comprehension.

Information disclosed → users understand it → behaviour changes → system improves.
The transparency action cycle, adapted from Fung, Graham & Weil (2007). The chain breaks at comprehension.

Australia's FOI system hits this exact failure point. Agencies disclose. But the information arrives in a form that only specialists can interpret. The people the system is supposed to protect never reach the stage where they can act.

Evidence & sources

Fung, A., Graham, M. & Weil, D. (2007). Full Disclosure: The Perils and Promise of Transparency. Cambridge University Press. The "Targeted Transparency Action Cycle" (Ch. 4) maps the chain from disclosure → comprehension → behavioural change → system improvement. doi.org

Office of the Australian Information Commissioner (OAIC) annual reports publish aggregate Freedom of Information (FOI) statistics but do not disaggregate decision quality, redaction reasoning, or compliance with mandatory considerations.

Who it hurts · Burden

Complexity is not neutral. It decides who gets to use the system.

When a process is hard to understand, most people give up. That sounds like a personal choice, but it is actually a design outcome. Researchers Pamela Herd and Donald Moynihan call this administrative burden — the idea that complicated processes function as invisible policy, deciding who benefits and who doesn't.

They identify three costs that complex systems impose on ordinary people:

🔍 Learning costs Figuring out your rights and how to use them
📋 Compliance costs Forms, evidence, deadlines, procedural steps
🧠 Psychological costs Stress, confusion, stigma, loss of autonomy

The harder it is to see how the system works, the higher each of these costs becomes. And they fall heaviest on people who have the fewest resources to absorb them — people without lawyers, without experience of government, people with disability or cognitive differences, people already dealing with crisis.

The harder the system is to understand, the fewer people can use it. That is not a side effect of complexity. It is what complexity does.
Evidence & sources

Herd, P. & Moynihan, D. (2018). Administrative Burden: Policymaking by Other Means. Russell Sage Foundation. The framework of learning, compliance, and psychological costs appears in Ch. 1–2. Distributive impacts are documented across U.S. federal programs. russellsage.org

The Law Council of Australia's Justice Project (2018) identified legal complexity as a primary barrier to access to justice in Australia, with particular impact on people with disability, Aboriginal and Torres Strait Islander people, and people in regional areas.

The gap · Compliance vs. accountability

Following the rules is not the same as being accountable.

Agencies report to Parliament. They publish annual reports. They list how many FOI requests they processed and how many were on time.

This looks like accountability. But it is compliance — and the two are not the same thing.

Compliance asks: "Did we follow the steps?" Accountability asks: "Can the people we serve tell whether we followed the steps?"

Right now, agencies report their own performance using their own categories. There are no independent, public tools that let outsiders check the quality of decisions — not just the quantity. No tools that show where discretion clusters, where redactions are over-applied, or where delays are routine rather than exceptional.

Mark Moore's public value framework argues that government agencies exist to produce value for the public — not just to follow their own procedures. If agencies cannot demonstrate that their processes actually serve the people they affect, the processes lack public value no matter how efficiently they run.
Evidence & sources

Fung, A., Graham, M. & Weil, D. (2007) distinguish "transparency for compliance" from "targeted transparency" designed to change behaviour.

Moore, M.H. (1995). Creating Public Value: Strategic Management in Government. Harvard University Press.

Moore, M.H. & Khagram, S. (2004). "On Creating Public Value." Corporate Social Responsibility Initiative Working Paper No. 3, Harvard Kennedy School. hks.harvard.edu

The argument · What changes

When you can see the system, three things change.

1. Patterns become visible.

Where do redactions cluster? Which agencies miss deadlines? Where is discretion exercised without legally required considerations? Diagnostic tools turn scattered anecdotes into evidence.

2. Pressure points become clear.

Reformers, journalists, and oversight bodies can focus on the parts of the system that are actually failing — not just the parts that make headlines.

3. Agencies can improve.

Diagnosis is not punishment. When an agency can see where its own processes are breaking, it can fix them. Visibility serves the system, not just its critics.

A system that cannot see its own failures cannot fix them.

Evidence & sources

Australian Law Reform Commission & Administrative Review Council, Open Government: A Review of the Federal Freedom of Information Act 1982, ALRC Report 77 (1995), [2.3]. The Review found that restricted access to government information produces feelings of 'powerlessness' in citizens and that access is 'the currency' required for participation in governance. See also ALRC Report 112 (2009), Secrecy Laws and Open Government in Australia, [2.10]–[2.12], [2.82].

Fung, A., Graham, M. & Weil, D. (2007, Ch. 7) describe "collaborative transparency" — systems where users, disclosers, and government all contribute to improving information quality over time.

Lessig, L. (2006) argues that making architectural choices visible is a precondition for democratic oversight. When regulation operates through design rather than explicit rules, it escapes scrutiny unless the design itself is made legible.

Identity · What GlassCase is

A diagnostic layer for Australian civic-legal systems.

GlassCase is a civic-tech lab that maps how Australian law works in practice. Not how it reads in a statute — how it actually operates when someone lodges a request, challenges a decision, or waits for a response that never comes.

We build open, interactive tools that make the hidden architecture of administrative decisions visible. Where does discretion sit? Where do delays cluster? Where are redactions applied, and why?

GlassCase is

A civic-tech lab that maps real administrative and legal pathways.
An open-access research platform — tools are free to use.
A diagnostic layer that reveals how decisions are structured, not whether specific decisions were right or wrong.

GlassCase is not

A law firm or legal advice service.
An advocacy organisation that campaigns for particular outcomes.
A replacement for professional legal representation.
An AI product. Tools are built from legislation and case law. AI assists in the development process; the founder designs, verifies, and takes responsibility for every output.

Founded and led by Jay Spudvilas — an education leader with twelve years of public governance experience and a Juris Doctor student at the Australian National University (ANU).

Commitments · How we work

Six commitments that shape everything we build.

1. Make what is hidden visible.

If a process affects people, they should be able to see how it works — the rules, the timelines, the discretion points, and the review options.

2. Work from real data.

Every tool is built from legislation, case law, or publicly available administrative data. No tool is released without a traceable evidence base.

3. Open by default.

Tools and frameworks are freely available. Research outputs are published with DOIs through Zenodo. Policy submissions are on the public record.

4. Map the system, not the person.

GlassCase maps how decisions are structured. It does not profile individuals, track personal data, or predict specific outcomes.

5. Design for the people affected.

If a tool only makes sense to lawyers, it hasn't finished its job. Public translations are delivered through LightKey.

6. Name what you don't know.

Where evidence is limited, we say so. We use version numbers because tools improve. v0.1 means functional but early.

What we build · Tools

Open diagnostic tools built from legislation and case law.

Every tool is open-access, structured for reuse, and published with DOI links on Zenodo. The goal is infrastructure, not gatekeeping.

Available now · v0.1

Consideration Matrix

Maps mandatory and discretionary considerations in administrative decisions. Built from Peko-Wallsend and Li. Shows what the statute required against what appeared in reasons.
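As a rough illustration of the idea — not the published tool — a consideration check can be sketched as a comparison between a statutory checklist and the text of the stated reasons. The consideration labels and the naive substring matching below are illustrative assumptions:

```python
# Hypothetical sketch: flag mandatory considerations that never appear in
# a decision's stated reasons. Labels and matching logic are illustrative,
# not drawn from the actual GlassCase Consideration Matrix.

MANDATORY = {
    "public interest factors favouring disclosure",
    "public interest factors against disclosure",
    "personal privacy of third parties",
}

def audit_reasons(reasons_text: str, mandatory: set[str]) -> dict:
    """Return which mandatory considerations are addressed vs missing."""
    text = reasons_text.lower()
    addressed = {c for c in mandatory if c in text}
    return {
        "addressed": sorted(addressed),
        "missing": sorted(mandatory - addressed),
    }

reasons = (
    "The decision-maker weighed public interest factors favouring disclosure "
    "against the personal privacy of third parties."
)
result = audit_reasons(reasons, MANDATORY)
print(result["missing"])  # the consideration the reasons never mention
```

A real matrix would need far richer matching than substrings, but the output shape — required versus appeared — is the point: a missing entry is exactly the Peko-Wallsend failure mode.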

Available now · v0.2

Redaction Taxonomy

Structured coding scheme for analysing FOI redactions and decision quality. Includes the FOI Redaction Logic Visualiser for mapping exemption pathways.
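The coding idea can be sketched in a few lines: each redaction is tagged with its claimed exemption ground, then tallied so clustering becomes visible. The data and schema below are illustrative assumptions, not the published taxonomy; the section numbers are real FOI Act exemption grounds:

```python
# Hypothetical sketch of a redaction coding pass: tag each redaction with
# an FOI exemption ground, then tally to expose clustering. The records
# are invented; only the section numbers reflect the FOI Act 1982 (Cth).

from collections import Counter

redactions = [
    {"doc": "A1", "ground": "s 47F"},    # personal privacy
    {"doc": "A1", "ground": "s 47F"},
    {"doc": "A2", "ground": "s 47E(d)"}, # agency operations
    {"doc": "A3", "ground": "s 47F"},
    {"doc": "A3", "ground": "s 33"},     # security / international relations
]

by_ground = Counter(r["ground"] for r in redactions)

# A heavy skew toward one ground across many documents is the kind of
# pattern the taxonomy is meant to surface for closer scrutiny.
for ground, count in by_ground.most_common():
    print(f"{ground}: {count}")
```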

Available now · v0.1

Institutional Stress Mapper

Maps institutional stress signals over time. Score subsystems, detect patterns, identify where deeper inquiry may be needed — before crisis.
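One way to sketch the scoring idea: weight a few normalised stress signals per subsystem and flag any subsystem above a threshold. The signal names, weights, and threshold here are illustrative assumptions, not the published instrument:

```python
# Hypothetical sketch: weighted stress score per subsystem, flagging those
# above a threshold. Signals, weights, and threshold are invented for
# illustration; readings are assumed normalised to 0..1.

SIGNALS = {
    "deadline_breaches": 0.5,
    "review_overturns": 0.3,
    "complaint_volume": 0.2,
}

def stress_score(readings: dict[str, float]) -> float:
    """Weighted sum of signal readings; missing signals count as 0."""
    return sum(SIGNALS[name] * readings.get(name, 0.0) for name in SIGNALS)

subsystems = {
    "FOI processing": {"deadline_breaches": 0.8, "review_overturns": 0.4, "complaint_volume": 0.6},
    "Licensing":      {"deadline_breaches": 0.1, "review_overturns": 0.2, "complaint_volume": 0.1},
}

THRESHOLD = 0.5
flagged = [name for name, r in subsystems.items() if stress_score(r) >= THRESHOLD]
print(flagged)  # subsystems warranting deeper inquiry
```

The design choice worth noting is that the score is diagnostic, not conclusive: a flag marks where to look, not a finding of failure.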

In development

FOI Process Visualiser · Policy-Decision Heatmap · Procedural Fairness Simulator

Next-generation tools: map each delay point in an FOI request, show where discretion clusters across departments, and model how small rule changes affect outcomes.

We also publish essays (Findings), short-form commentary on single legal concepts, and policy submissions — including Senate Submission 38 on the FOI Amendment (Reform) Bill 2025.

Evidence & sources

GlassCase Consideration Matrix (v0.1) — DOI 10.5281/zenodo.18087523

GlassCase FOI Redaction Taxonomy (v0.2) — DOI 10.5281/zenodo.18062442

GlassCase Institutional Stress Mapper (v0.1) — DOI 10.5281/zenodo.18522839

Sweller, J. (1988). "Cognitive Load During Problem Solving." Cognitive Science, 12(2), 257–285. GlassCase tools are designed to reduce extraneous cognitive load while preserving intrinsic complexity.

The public layer · LightKey

GlassCase diagnoses. LightKey translates.

Diagnostic layer

GlassCase.org

Open tools that map how administrative decisions are made — where discretion lives, where delays cluster, where redactions are applied. For researchers, oversight bodies, journalists, and policy designers.

Public layer

LightKey.org

Step-by-step pathway guides for people who deal with government processes. Designed to be clear and navigable, with particular attention to neurodivergent people and people with disability. A GlassCase initiative.

Founding principle

Fairness by Design

How a process is designed shapes whether it is fair. Making a process visible is the first step to making it fair. Making the system visible is the step that makes reform possible.

Transparency that only reaches professionals is not yet finished.

But what about... · Testing the argument

Four honest objections.

Objection

"Isn't this what auditors and ombudsmen already do?"

Oversight bodies play a critical role. But they work case by case, with limited resources, and often after harm has occurred. GlassCase builds diagnostic infrastructure — tools that make patterns visible across many cases at once, so that oversight bodies, journalists, and researchers have better evidence to work with. It supports oversight. It does not replace it.

Objection

"Won't this undermine trust in government?"

This assumes trust is built by keeping problems out of view. It is not. A health system that publishes its error rates and shows how it is reducing them earns more trust than one that claims it makes no errors. The same applies to administrative systems.

Objection

"Agencies already publish annual reports and statistics."

They do. But aggregate numbers tell you how many decisions were made — not whether they were made well. An agency can process 95% of FOI requests "on time" while routinely over-redacting, ignoring mandatory considerations, or issuing decisions that cannot withstand review. GlassCase measures decision quality, not just throughput.

Objection

"People might misuse diagnostic tools."

This assumes complexity is a useful filter. It is not. Complexity filters out people who do not already know how the system works — not people with weak arguments. If making the system visible increases scrutiny, that tells us the real level of concern was always higher. GlassCase maps the system, not the person.

Evidence & sources

Fung et al. (2007) document how aggregate disclosure often fails the action cycle test. Information satisfies legal requirements but is too abstract for users to act on.

Herd & Moynihan (2018) note that systems often measure process volume rather than outcome quality, creating an illusion of accountability.

Who we serve · Audience

Built for people who need to see how the system works.

Researchers & academics

Structured diagnostic tools and DOI-registered frameworks for studying administrative decision-making.

Oversight bodies

Maps of where discretion, delay, and redaction patterns cluster across agencies and jurisdictions.

Policy workers

Evidence for reform design. Senate Submission 38 demonstrates applied use in legislative consultation.

Legal practitioners

Case-law-grounded frameworks that link doctrine to procedure. Built for practice and teaching.

Journalists

Open tools for investigating how administrative decisions are structured across agencies.

General public

Plain-language pathway guides via LightKey that show how government processes work and what your rights are at each step.

What needs to happen · Action

Visibility is not a bonus. It is infrastructure.

Policymakers

Require agencies to publish structured decision data — not just aggregate statistics. Make the reasoning behind decisions reviewable, not just the outcomes.

Oversight bodies

Adopt diagnostic tools that measure decision quality, not just volume. Use pattern-level evidence to identify systemic failures before they require individual complaints to surface.

Researchers

Study the link between process design and access to justice. If clearer systems produce more complaints, the real level of concern was always higher.

Agencies

Treat legibility as part of your public value — not a compliance afterthought. A process citizens cannot follow fails its own purpose, no matter how correctly it is administered.

Evidence & sources

The Australian Government's Data and Digital Government Strategy supports community partnerships to improve service delivery and public access to government information.

Herd, P. & Moynihan, D. (2018) recommend auditing systems for burden — examining learning, compliance, and psychological costs as measurable design variables.

Moore, M.H. (1995) argues that public value requires outcomes citizens can recognise as valuable. Legibility is a precondition for public value in administrative systems.

Boundaries · Safety, ethics, and limits

What GlassCase will not do.

GlassCase publishes information, not legal advice. Tools are diagnostic — they show how decisions are structured, not what you should do about a specific decision.

We do not collect personal data beyond basic analytics (with consent). We do not profile users. We do not sell data. Our tools are built from publicly available sources. Where evidence is uncertain or limited, we say so.

We will not provide legal advice or tell you what decision to make
We will not access or store personal case information
We will not make predictions about the outcome of specific decisions
We will not claim authority beyond our evidence base
Next steps · Invitation

GlassCase is open. The tools are free. The research is published.

If you are a researcher, oversight body, policy unit, or university interested in collaboration — get in touch. If you want to use the tools, they are on glasscase.org. If you want the plain-language version, it is on lightkey.org.

If you see something we got wrong, tell us. If you have data that could make a tool better, share it.

Collaborate

Partner on research or policy projects.

Contribute

Share data, code, or design expertise.

Pilot

Explore institutional partnerships.

GlassCase exists because the architecture of government decisions is invisible to the people those decisions affect.

We make that architecture visible — not to campaign for particular outcomes,
but because fairness requires it.

If you cannot see how a decision is made,
you cannot tell whether it was made properly.

Making integrity visible is not optional. It is infrastructure.

This paper uses the same design principles as GlassCase: one idea at a time, evidence available but not blocking the path, complexity preserved but navigable. If it read clearly, the principle holds.