
Autocorrect, Other AI Applications, Are Biased Against Rural Language Like Hunting And Fishing Terms

I spent about half my life in very rural areas of America learning how to hunt and fish and otherwise have a good time in the outdoors. For the other half, of course, I’ve been in cities of various sizes learnin’ fancy lawyer words and then trying to make some money with them.

I had some success along each of these paths. Which pretty much makes me too much of a redneck to fit in well in big cities while simultaneously making me too much of an over-educated liberal to really feel at home out in the country.

That problem aside, my somewhat unique cultural perspective can also be an asset. Like when I notice weird things happening with the autocorrect features on various applications on my Apple iPhone.

I was texting my cousin this morning using Facebook Messenger. I mentioned “snaring rabbits.” Despite having spelled that perfectly the first time, Facebook Messenger’s autocorrect feature changed my carefully selected phrase to “sharing rabbits,” which totally makes more sense I guess, what with Easter coming up and all.

This was not an isolated incident. I have a friend who lives in Missouri. I text him from time to time to ask about his level of success in “gigging” frogs (the best definition of a “gig” that I’ve found is “a harpoonlike device,” so when you are gigging frogs, you are spearing them, to make delicious frog legs). My iPhone sure likes to automatically change that to “gagging” frogs.

There is certainly a discussion to be had about humane methods of what is legally referred to in many jurisdictions as “animal take.” I just don’t know why my iPhone seems intent on manual strangulation as opposed to a quick and neat spear thrust.

I did not readily find academic research about artificial intelligence applications potentially being biased against more rural terminology (there is a whole bunch of research out there about AI being racist, however). Anecdotally, I have noticed this repeatedly, and it kind of makes sense logically based on how text-based AI features function.

What an autocorrect feature (or a large language model asked to write something) is doing, after having read through all sorts of content similar to what is currently being generated, is predicting what the human writer is most likely to want to say. Only about 20% of Americans live in rural areas. That makes for a lot more people texting, messaging, and otherwise writing about things going on in cities, suburbs, and medium-sized communities. AIs are trained disproportionately on content created by people living in higher-population areas.
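
To make that concrete, here is a minimal sketch of a frequency-based corrector in Python. It is written in the spirit of classic spell-checker designs, not the actual code Apple or Meta ships, and the word counts below are invented purely for illustration. The point it demonstrates: a rare-but-correct rural term loses to a common word that is only one letter away.

from collections import Counter

def edits1(word):
    # Every string one edit away: deletions, swaps, replacements, insertions.
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    transposes = [a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1]
    replaces = [a + c + b[1:] for a, b in splits if b for c in letters]
    inserts = [a + c + b for a, b in splits for c in letters]
    return set(deletes + transposes + replaces + inserts)

def autocorrect(word, freqs):
    # Pick the most frequent candidate among the word and its one-edit neighbors.
    candidates = {word} | edits1(word)
    return max(candidates, key=lambda w: freqs.get(w, 0))

# Hypothetical counts skewed toward more common, less rural usage.
freqs = Counter({"sharing": 90_000, "snaring": 40, "gagging": 5_000, "gigging": 25})

print(autocorrect("snaring", freqs))  # -> "sharing"
print(autocorrect("gigging", freqs))  # -> "gagging"

Real autocorrect systems also look at the surrounding words and use far more sophisticated models, but the basic pull toward whatever the bulk of the training data says is the same.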

For the record, I did ask the latest version of ChatGPT, “Do you know anything about gigging frogs?” It totally nailed the correct answer. It also asked me why I was interested in gigging frogs, so I explained that I was trying to ascertain whether it was familiar with terms more commonly used in rural America. It assured me it was.

While I was at it, I actually got kind of wrapped up in a whole conversation with the thing about this entire article. ChatGPT more or less agreed with me about how less-sophisticated AI used in autocorrect features works and thus how it often fails to correctly recognize niche terminology that may be more prevalent in rural America as it tries to serve the broadest group of users possible. The whole thing was equal parts impressive and uncanny.

Maybe all we’ve learned here is that the AI used to correct our grammar on iPhones and on Facebook is quite primitive in comparison to OpenAI’s perhaps appropriately hyped platform. But if you ask me (or ChatGPT), texting and messaging applications currently in use certainly seem to be biased against language, like niche hunting and fishing terms, used primarily among smaller groups of people.

There are obviously greater injustices in the world. I can tell you, though, that messaging applications correcting my already correct text messages about rural things is an almost weekly annoyance, and one that makes me feel just a little bit more resentful toward big tech companies every time it happens. That can’t be a good thing for America’s rural-urban political divide.




Jonathan Wolf is a civil litigator and author of Your Debt-Free JD (affiliate link). He has taught legal writing, written for a wide variety of publications, and made it both his business and his pleasure to be financially and scientifically literate. Any views he expresses are probably pure gold, but are nonetheless solely his own and should not be attributed to any organization with which he is affiliated. He wouldn’t want to share the credit anyway. He can be reached at [email protected].