
California Bar Reveals It Used AI For Exam Questions, Because Of Course It Did – Above the Law

Unaware that you're supposed to rip the band-aid off all at once, the embattled officials behind the California Bar Exam decided they hadn't had enough of the non-stop cavalcade of disastrous headlines about February's exam, so they just casually threw out there that they used AI to come up with some of the questions.

Everyone's going to be chill about this, right? I mean, "bar exams" and "artificial intelligence" are two topics that famously elicit only rational, measured reactions.

February's haphazard glitch-fest was the product of a shotgun wedding revamp that ditched the NCBE in an effort to save the state licensing operation from descending into bankruptcy. The NCBE's testing venue rules had pushed California's resources to the limit, with the most populous state in the union forced to book massive, expensive, and mostly inconvenient locations every time it offered the test. A new provider would offer the examiners the opportunity to do more remote testing and take advantage of multiple, smaller locations to save money. It sounded good on paper and, more or less, it's still a better path forward for California.

But things didn't quite work out this time.

They commissioned Kaplan to write the questions and Meazure to administer the test. Then there were practice test problems, technical problems galore, the more convenient venues didn't materialize, and the proctoring spawned subreddits' worth of horror stories. They ended up having to offer refunds and make-up tests before floating the possibility of inviting more chaos by throwing the whole project in the can.

Making this the perfect time to inform the not-at-all-stressed applicants that the questions might have also been hallucinated AI slop. THANKS, BYE!


According to the LA Times:

The State Bar of California said in a news release Monday that it will ask the California Supreme Court to adjust test scores for those who took its February bar exam. But it declined to acknowledge significant problems with its multiple-choice questions even as it revealed that a subset of questions were recycled from a first-year law student exam, while others were developed with the assistance of AI by ACS Ventures, the State Bar's independent psychometrician.

Despite what it sounds like, psychometricians are not villains from an L. Ron Hubbard book. But they also are not lawyers, which understandably troubled educators and applicants when they learned that a bunch of non-lawyers used AI to develop questions. Basically, their job is to measure test performance, not use ChatGPT to rewrite "Contracts for Dummies."

Before rushing to rage, note that the AI-aided questions made up a "small subset" of the exam, amounting to 23 of the 171 scored multiple-choice questions. It's also worth noting that developing questions isn't the same as handing over test design to the computers. The chair of the State Bar's Committee of Bar Examiners, Alex Chan, pushed back against that prospect, explaining, "the professors are suggesting that we used AI to draft all of the multiple choice questions, as opposed to using AI to vet them."

Over and above whatever it means for AI to "vet" them, the bar also said that all questions were reviewed by subject matter experts. Of course, those same panels suspiciously lost a few credentialed law professors after the Bar worried that academics who worked with the NCBE in the past could raise copyright issues. Theoretically the non-profit NCBE could waive those in the public interest, but it's also a non-profit with $175 million in assets, so draw your own conclusions about how that would've gone down.

There's nothing inherently wrong with AI helping out in this process. If the examiners have confidence in the subject matter experts reviewing the final product, it might be a useful brainstorming tool. Indeed, the AI doesn't seem as concerning as who allegedly used it. Garbage in, garbage out is a real problem, and whether it's writing or vetting the questions, having a non-lawyer on the other end of the keyboard raises risks. Some of the professors cited in the Times article also worry that it raises conflict of interest issues, as the psychometrician is expected to vouch for the reliability of the questions on the back end, though the bar examiners stress that this process isn't subjective.

Besides, the bar examiners were told to consider using AI by no less than the California Supreme Court, which oversees the process. That statement prompted the California Supreme Court to immediately respond that it never knew anything about this until the press release dropped.

WELL. OILED. MACHINE.

There are a lot of risks in using AI generally and in building a life-changing exam specifically. Other than the inherent shadiness involved in making this announcement a couple months after the fact and having the ultimate authorities at the State Supreme Court reply, "Whoa! Don't rope us into this mess!" there's reason to believe the bar examiners probably used AI responsibly here.

Still, Katie Moran, an associate professor at the University of San Francisco School of Law, told the Times that the State Bar should "release all 200 questions that were on the test for transparency and to allow future test takers a chance to get used to the different questions." That's the sort of proactive transparency that avoids these sorts of belated bombshells. It also puts more eyes on the questions and helps the authors (the human ones) refine questions in response to the wisdom of the subject matter expert crowd.

"She also called on the State Bar to return to the multi-state bar exam for the July exams." Yeah… for the love of all that's holy, don't do that. Nothing is solved by more whiplash, there's almost no chance the examiners could lock down all the pricey venues they would need by July, and, as the State Bar notes, one of the few things applicants hate more than this test is the lack of a remote option.

Also, the NCBE questions consistently generate a ton of applicant complaints. While everyone in California (rightly) grumbles about these new questions, just wait until July when we start seeing the metric shit ton of rage over incoherent multi-state questions. Eventually… the NCBE bans talking about the substance of the exam, ostensibly to protect the sanctity of the exam for later test takers but with the useful side effect of keeping criticism tamped down until after results are released. TRANSPARENCY!

The California Bar Exam is broken in a lot of ways and people should be angry about it. But without a time machine to go back a number of years to when they should've started planning to move to this new test, instead of slapping it together in a matter of months, it is what it is. And they're working on fixing it, even if starting this in February was a borderline irresponsible move.

And of all the problems we've heard about throughout this process, using AI to develop some questions doesn't make the top tier.




Joe Patrice is a senior editor at Above the Law and co-host of Thinking Like A Lawyer. Feel free to email any tips, questions, or comments. Follow him on Twitter or Bluesky if you're interested in law, politics, and a healthy dose of college sports news. Joe also serves as a Managing Director at RPN Executive Search.