
From: "Sergio Navega" <snavega@ibm.net>
Subject: Re: Intelligent behavior.
Date: 30 Jan 1999 00:00:00 GMT
Message-ID: <36b31b30@news3.ibm.net>
References: <78tbjm$c5m@edrn.newsguy.com> <78tght$29g$1@usenet01.srv.cis.pitt.edu> <78tim8$erf@ux.cs.niu.edu>
X-Notice: should be reported to postmaster@ibm.net
X-MimeOLE: Produced By Microsoft MimeOLE V4.71.1712.3
X-Complaints-To: postmaster@ibm.net
X-Trace: 30 Jan 1999 14:46:08 GMT, 166.72.21.217
Organization: SilWis
Newsgroups: comp.ai.philosophy

Neil Rickert wrote in message <78tim8$erf@ux.cs.niu.edu>...
>andersw+@pitt.edu (Anders N Weinstein) writes:
>
>>Actually, this rather supports Neil's view, I think, that the
>>system can come to exploit its fixed receptors in quite novel ways.
>>Most actual handicapped people are not limited in the way *you*
>>hypothesize, to purely digital inputs.
>
>Again, Anders has expressed my position rather well.  Blind people,
>for example, acquire abilities to pick up information through tactile
>senses that most of us can only pick up through our eyes.  I worry
>that Daryl has crippled Fred to the extent that this sort of
>perceptual reorganization becomes impossible.
>

Sorry, I can't see this happening through improvement of the touch
sensors or an increase in receptor density in the blind person's hand.
There's no evidence that this happens in human bodies.
It happens, so far as we know, by brain plasticity. That means
that the perceptual system (sensory transducers + "close" brain
processing) improves, but the sensory transducers are fixed,
because the latter were "designed" by evolution and are immutable
during one's life.

No man can hear sounds with a frequency below 20 or 15 Hz.
The wind constantly flows around us at 0.2 to 5 Hz.
We do not hear it, because that was not important in
evolutionary terms. If our lives depended on it, I bet we
would come to hear the wind within 500,000 years.

Regards,
Sergio Navega.

From: usurper@euronet.nl (TechnoCrate)
Subject: Re: Intelligent behavior.
Date: 30 Jan 1999 00:00:00 GMT
Message-ID: <36b43a48.944309@news.euronet.nl>
Content-Transfer-Encoding: 7bit
References: <78tbjm$c5m@edrn.newsguy.com> <78tght$29g$1@usenet01.srv.cis.pitt.edu> <78tim8$erf@ux.cs.niu.edu> <36b31b30@news3.ibm.net>
Content-Type: text/plain; charset=us-ascii
Organization: Bad Advice Department
Mime-Version: 1.0
Newsgroups: comp.ai.philosophy

On Sat, 30 Jan 1999 11:30:44 -0200, "Sergio Navega" <snavega@ibm.net>
wrote:

>Neil Rickert wrote in message <78tim8$erf@ux.cs.niu.edu>...
>>andersw+@pitt.edu (Anders N Weinstein) writes:
>>
>>>Actually, this rather supports Neil's view, I think, that the
>>>system can come to exploit its fixed receptors in quite novel ways.
>>>Most actual handicapped people are not limited in the way *you*
>>>hypothesize, to purely digital inputs.
>>
>>Again, Anders has expressed my position rather well.  Blind people,
>>for example, acquire abilities to pick up information through tactile
>>senses that most of us can only pick up through our eyes.  I worry
>>that Daryl has crippled Fred to the extent that this sort of
>>perceptual reorganization becomes impossible.
>>
>

It is said that the hands feel what the eyes see, but this is only
partially true. There may be systems that are shared between sight and
touch, but our tactile sense is not able to change in a way that lets
it receive new types of stimuli. For example, our tactile sense is
unable to feel color or luminance, something our eyes are very capable
of. Shape perception and a number of other things (texture, location,
size) are shared by the tactile and visual senses.

>Sorry, I can't see this happening through improvement of the touch
>sensors or an increase in receptor density in the blind person's hand.
>There's no evidence that this happens in human bodies.
>It happens, so far as we know, by brain plasticity. That means
>that the perceptual system (sensory transducers + "close" brain
>processing) improves, but the sensory transducers are fixed,
>because the latter were "designed" by evolution and are immutable
>during one's life.
>
Yes, very true. It's even believed that phantom pains are partly a
result of cannibalism in the brain. The regions of the brain that
are no longer used are taken over by neighbouring regions, which can
result in an itch felt in an amputated arm while the real itch is in the
cheek, for example (Ramachandran describes some peculiar cases much in
the way Sacks does). The cannibalizing region is of course enhanced by
the added computational power (if one can use "computational" in this
sense).

From: rickert@cs.niu.edu (Neil Rickert)
Subject: Re: Intelligent behavior.
Date: 30 Jan 1999 00:00:00 GMT
Message-ID: <7902qe$gk0@ux.cs.niu.edu>
References: <78tim8$erf@ux.cs.niu.edu> <36b31b30@news3.ibm.net>
Organization: Northern Illinois University
Newsgroups: comp.ai.philosophy

"Sergio Navega" <snavega@ibm.net> writes:

>Neil Rickert wrote in message <78tim8$erf@ux.cs.niu.edu>...
>>andersw+@pitt.edu (Anders N Weinstein) writes:

>>>Actually, this rather supports Neil's view, I think, that the
>>>system can come to exploit its fixed receptors in quite novel ways.
>>>Most actual handicapped people are not limited in the way *you*
>>>hypothesize, to purely digital inputs.

>>Again, Anders has expressed my position rather well.  Blind people,
>>for example, acquire abilities to pick up information through tactile
>>senses that most of us can only pick up through our eyes.  I worry
>>that Daryl has crippled Fred to the extent that this sort of
>>perceptual reorganization becomes impossible.

>Sorry, I can't see this happening through improvement of the touch
>sensors or an increase in receptor density in the blind person's hand.
>There's no evidence that this happens in human bodies.

Sorry for the confusion.  There is no suggestion of changing the
touch sensors or their density.  The claim is that there is
information already present in the output of those sensors that is
not normally used by a sighted person.  The brain of the blind person
can do some rewiring to make this accessible.  Digitization intended
to be suitable for the information needs of the sighted person may
lose this information.

From: "james d. hunter" <jim.hunter@jhuapl.edu>
Subject: Re: Intelligent behavior.
Date: 30 Jan 1999 00:00:00 GMT
Message-ID: <36B39A13.C902133C@jhuapl.edu>
Content-Transfer-Encoding: 7bit
References: <78tbjm$c5m@edrn.newsguy.com> <78tght$29g$1@usenet01.srv.cis.pitt.edu> <78tim8$erf@ux.cs.niu.edu> <36b31b30@news3.ibm.net>
Content-Type: text/plain; charset=us-ascii
X-Complaints-To: usenet@houston.jhuapl.edu
X-Trace: houston.jhuapl.edu 917740051 12689 128.244.27.28 (30 Jan 1999 23:47:31 GMT)
Organization: jhu/apl
Mime-Version: 1.0
Reply-To: jim.hunter@spam.free.jhuapl.edu.
NNTP-Posting-Date: 30 Jan 1999 23:47:31 GMT
Newsgroups: comp.ai.philosophy

Sergio Navega wrote:
>
> Neil Rickert wrote in message <78tim8$erf@ux.cs.niu.edu>...
> >andersw+@pitt.edu (Anders N Weinstein) writes:
> >
> >>Actually, this rather supports Neil's view, I think, that the
> >>system can come to exploit its fixed receptors in quite novel ways.
> >>Most actual handicapped people are not limited in the way *you*
> >>hypothesize, to purely digital inputs.
> >
> >Again, Anders has expressed my position rather well.  Blind people,
> >for example, acquire abilities to pick up information through tactile
> >senses that most of us can only pick up through our eyes.  I worry
> >that Daryl has crippled Fred to the extent that this sort of
> >perceptual reorganization becomes impossible.
> >
>
> Sorry, I can't see this happening through improvement of the touch
> sensors or an increase in receptor density in the blind person's hand.
> There's no evidence that this happens in human bodies.
> It happens, so far as we know, by brain plasticity. That means
> that the perceptual system (sensory transducers + "close" brain
> processing) improves, but the sensory transducers are fixed,
> because the latter were "designed" by evolution and are immutable
> during one's life.
>
> No man can hear sounds with a frequency below 20 or 15 Hz.
> The wind constantly flows around us at 0.2 to 5 Hz.
> We do not hear it, because that was not important in
> evolutionary terms. If our lives depended on it, I bet we
> would come to hear the wind within 500,000 years.

  Well, that's true, but it is also maybe not too important,
  because of the plasticity you mentioned. Personally,
  I -can- hear noises below 5 Hz, because in my line of
  work lives do depend on it.

  ---
  Jim

From: "Sergio Navega" <snavega@ibm.net>
Subject: Re: Intelligent behavior.
Date: 01 Feb 1999 00:00:00 GMT
Message-ID: <36b59c56@news3.ibm.net>
References: <78tbjm$c5m@edrn.newsguy.com> <78tght$29g$1@usenet01.srv.cis.pitt.edu> <78tim8$erf@ux.cs.niu.edu> <36b31b30@news3.ibm.net> <36B39A13.C902133C@jhuapl.edu>
X-Notice: should be reported to postmaster@ibm.net
X-MimeOLE: Produced By Microsoft MimeOLE V4.71.1712.3
X-Complaints-To: postmaster@ibm.net
X-Trace: 1 Feb 1999 12:21:42 GMT, 166.72.21.59
Organization: SilWis
Newsgroups: comp.ai.philosophy

james d. hunter wrote in message <36B39A13.C902133C@jhuapl.edu>...
>Sergio Navega wrote:
> >
> > Neil Rickert wrote in message <78tim8$erf@ux.cs.niu.edu>...
> > >andersw+@pitt.edu (Anders N Weinstein) writes:
> > >
> > >>Actually, this rather supports Neil's view, I think, that the
> > >>system can come to exploit its fixed receptors in quite novel ways.
> > >>Most actual handicapped people are not limited in the way *you*
> > >>hypothesize, to purely digital inputs.
> > >
> > >Again, Anders has expressed my position rather well.  Blind people,
> > >for example, acquire abilities to pick up information through tactile
> > >senses that most of us can only pick up through our eyes.  I worry
> > >that Daryl has crippled Fred to the extent that this sort of
> > >perceptual reorganization becomes impossible.
> > >
> >
> > Sorry, I can't see this happening through improvement of the touch
> > sensors or an increase in receptor density in the blind person's hand.
> > There's no evidence that this happens in human bodies.
> > It happens, so far as we know, by brain plasticity. That means
> > that the perceptual system (sensory transducers + "close" brain
> > processing) improves, but the sensory transducers are fixed,
> > because the latter were "designed" by evolution and are immutable
> > during one's life.
> >
> > No man can hear sounds with a frequency below 20 or 15 Hz.
> > The wind constantly flows around us at 0.2 to 5 Hz.
> > We do not hear it, because that was not important in
> > evolutionary terms. If our lives depended on it, I bet we
> > would come to hear the wind within 500,000 years.
>
>  Well, that's true, but it is also maybe not too important,
>  because of the plasticity you mentioned. Personally,
>  I -can- hear noises below 5 Hz, because in my line of
>  work lives do depend on it.
>

James, you left me curious!
What is your work, and how did you come to the conclusion that
you're able to hear it (and not feel it by touch, which, as we know,
can detect that frequency range)?

Regards,
Sergio Navega.

From: "james d. hunter" <jim.hunter@jhuapl.edu>
Subject: Re: Intelligent behavior.
Date: 01 Feb 1999 00:00:00 GMT
Message-ID: <36B5D080.C43E884C@jhuapl.edu>
Content-Transfer-Encoding: 7bit
References: <78tbjm$c5m@edrn.newsguy.com> <78tght$29g$1@usenet01.srv.cis.pitt.edu> <78tim8$erf@ux.cs.niu.edu> <36b31b30@news3.ibm.net> <36B39A13.C902133C@jhuapl.edu> <36b59c56@news3.ibm.net>
Content-Type: text/plain; charset=us-ascii
X-Complaints-To: usenet@houston.jhuapl.edu
X-Trace: houston.jhuapl.edu 917885056 2870 128.244.27.28 (1 Feb 1999 16:04:16 GMT)
Organization: jhu/apl
Mime-Version: 1.0
Reply-To: jim.hunter@spam.free.jhuapl.edu.
NNTP-Posting-Date: 1 Feb 1999 16:04:16 GMT
Newsgroups: comp.ai.philosophy

Sergio Navega wrote:
>

[...]

> > >
> > > No man can hear sounds with a frequency below 20 or 15 Hz.
> > > The wind constantly flows around us at 0.2 to 5 Hz.
> > > We do not hear it, because that was not important in
> > > evolutionary terms. If our lives depended on it, I bet we
> > > would come to hear the wind within 500,000 years.
> >
> >  Well, that's true, but it is also maybe not too important,
> >  because of the plasticity you mentioned. Personally,
> >  I -can- hear noises below 5 Hz, because in my line of
> >  work lives do depend on it.
> >
>
> James, you left me curious!
> What is your work, and how did you come to the conclusion that
> you're able to hear it (and not feel it by touch, which, as we know,
> can detect that frequency range)?

  My work roams around, really. It's mostly information and signal
  processing. Hearing noises that are outside of the normal human
  hearing range is an old trick involving frequency translators.
  You can filter the sound and translate the 0.2-5 Hz interval up to
  around 800 Hz, where you can hear it just fine.
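
  A minimal sketch of this translation in Python (NumPy/SciPy), assuming
  a digitized recording; the sampling rate, test signal, and exact band
  edges are illustrative choices, not details from James's setup:

    import numpy as np
    from scipy.signal import butter, sosfilt, hilbert

    fs = 4000.0                           # sampling rate in Hz (assumed)
    t = np.arange(0.0, 10.0, 1.0 / fs)    # ten seconds of signal
    rng = np.random.default_rng(0)
    # Stand-in for a recording with inaudible 0.2-5 Hz "wind" energy.
    x = np.sin(2 * np.pi * 1.5 * t) + 0.1 * rng.standard_normal(t.size)

    # 1. Isolate the 0.2-5 Hz band with a bandpass filter.
    sos = butter(4, [0.2, 5.0], btype="bandpass", fs=fs, output="sos")
    band = sosfilt(sos, x)

    # 2. Shift the band up by 800 Hz: take the analytic signal and
    #    multiply by a complex carrier (a single-sideband translation).
    shifted = np.real(hilbert(band) * np.exp(2j * np.pi * 800.0 * t))
    # The energy now sits near 800-805 Hz, which is audible -- but the
    # 0.2-5 Hz band has been relocated, not added to our native range.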

  ---
  Jim

From: "Sergio Navega" <snavega@ibm.net>
Subject: Re: Intelligent behavior.
Date: 01 Feb 1999 00:00:00 GMT
Message-ID: <36b5e49f@news3.ibm.net>
References: <78tbjm$c5m@edrn.newsguy.com> <78tght$29g$1@usenet01.srv.cis.pitt.edu> <78tim8$erf@ux.cs.niu.edu> <36b31b30@news3.ibm.net> <36B39A13.C902133C@jhuapl.edu> <36b59c56@news3.ibm.net> <36B5D080.C43E884C@jhuapl.edu>
X-Notice: should be reported to postmaster@ibm.net
X-MimeOLE: Produced By Microsoft MimeOLE V4.71.1712.3
X-Complaints-To: postmaster@ibm.net
X-Trace: 1 Feb 1999 17:30:07 GMT, 166.72.21.110
Organization: SilWis
Newsgroups: comp.ai.philosophy

james d. hunter wrote in message <36B5D080.C43E884C@jhuapl.edu>...
>Sergio Navega wrote:
>>
>
>[...]
>
> > > >
> > > > No man can hear sounds with a frequency below 20 or 15 Hz.
> > > > The wind constantly flows around us at 0.2 to 5 Hz.
> > > > We do not hear it, because that was not important in
> > > > evolutionary terms. If our lives depended on it, I bet we
> > > > would come to hear the wind within 500,000 years.
> > >
> > >  Well, that's true, but it is also maybe not too important,
> > >  because of the plasticity you mentioned. Personally,
> > >  I -can- hear noises below 5 Hz, because in my line of
> > >  work lives do depend on it.
> > >
> >
> > James, you left me curious!
> > What is your work, and how did you come to the conclusion that
> > you're able to hear it (and not feel it by touch, which, as we know,
> > can detect that frequency range)?
>
>
>  My work roams around, really. It's mostly information and signal
>  processing. Hearing noises that are outside of the normal human
>  hearing range is an old trick involving frequency translators.
>  You can filter the sound and translate the 0.2-5 Hz interval up to
>  around 800 Hz, where you can hear it just fine.
>

That's a trick using instruments! It is like passing your voice
through one of those pitch changers and talking like a child or a woman.
I thought we were talking about our native capacities.

Regards,
Sergio Navega.

From: "james d. hunter" <jim.hunter@jhuapl.edu>
Subject: Re: Intelligent behavior.
Date: 01 Feb 1999 00:00:00 GMT
Message-ID: <36B5E9C4.11FA6212@jhuapl.edu>
Content-Transfer-Encoding: 7bit
References: <78tbjm$c5m@edrn.newsguy.com> <78tght$29g$1@usenet01.srv.cis.pitt.edu> <78tim8$erf@ux.cs.niu.edu> <36b31b30@news3.ibm.net> <36B39A13.C902133C@jhuapl.edu> <36b59c56@news3.ibm.net> <36B5D080.C43E884C@jhuapl.edu> <36b5e49f@news3.ibm.net>
Content-Type: text/plain; charset=us-ascii
X-Complaints-To: usenet@houston.jhuapl.edu
X-Trace: houston.jhuapl.edu 917891523 10532 128.244.27.28 (1 Feb 1999 17:52:03 GMT)
Organization: jhu/apl
Mime-Version: 1.0
Reply-To: jim.hunter@spam.free.jhuapl.edu.
NNTP-Posting-Date: 1 Feb 1999 17:52:03 GMT
Newsgroups: comp.ai.philosophy

Sergio Navega wrote:
>
> james d. hunter wrote in message <36B5D080.C43E884C@jhuapl.edu>...
> >Sergio Navega wrote:
> >>
> >
> >[...]
> >
> > > > >
> > > > > No man can hear sounds with a frequency below 20 or 15 Hz.
> > > > > The wind constantly flows around us at 0.2 to 5 Hz.
> > > > > We do not hear it, because that was not important in
> > > > > evolutionary terms. If our lives depended on it, I bet we
> > > > > would come to hear the wind within 500,000 years.
> > > >
> > > >  Well, that's true, but it is also maybe not too important,
> > > >  because of the plasticity you mentioned. Personally,
> > > >  I -can- hear noises below 5 Hz, because in my line of
> > > >  work lives do depend on it.
> > > >
> > >
> > > James, you left me curious!
> > > What is your work, and how did you come to the conclusion that
> > > you're able to hear it (and not feel it by touch, which, as we know,
> > > can detect that frequency range)?
> >
> >
> >  My work roams around, really. It's mostly information and signal
> >  processing. Hearing noises that are outside of the normal human
> >  hearing range is an old trick involving frequency translators.
> >  You can filter the sound and translate the 0.2-5 Hz interval up to
> >  around 800 Hz, where you can hear it just fine.
> >
>
> That's a trick using instruments! It is like passing your voice
> through one of those pitch changers and talking like a child or a woman.
> I thought we were talking about our native capacities.

  I think we are sticking to native capacities.
  Eyeglasses are instruments, but I think eyeglasses
  aren't to be confused with surgical modifications
  of the brain, or long-term evolutionary changes.
  If you want to get really technical, pens, pencils,
  and paper are human instruments.

  ---
  Jim

From: Jim Balter <jqb@sandpiper.net>
Subject: Re: Intelligent behavior.
Date: 01 Feb 1999 00:00:00 GMT
Message-ID: <36B68155.42ACEF32@sandpiper.net>
Content-Transfer-Encoding: 7bit
References: <78tbjm$c5m@edrn.newsguy.com> <78tght$29g$1@usenet01.srv.cis.pitt.edu> <78tim8$erf@ux.cs.niu.edu> <36b31b30@news3.ibm.net> <36B39A13.C902133C@jhuapl.edu> <36b59c56@news3.ibm.net> <36B5D080.C43E884C@jhuapl.edu> <36b5e49f@news3.ibm.net>
X-Accept-Language: en-US
Content-Type: text/plain; charset=us-ascii
Organization: Sandpiper Networks, Inc.
Mime-Version: 1.0
Newsgroups: comp.ai.philosophy

Sergio Navega wrote:
>
> james d. hunter wrote in message <36B5D080.C43E884C@jhuapl.edu>...
> >Sergio Navega wrote:
> >>
> >
> >[...]
> >
> > > > >
> > > > > No man can hear sounds with a frequency below 20 or 15 Hz.
> > > > > The wind constantly flows around us at 0.2 to 5 Hz.
> > > > > We do not hear it, because that was not important in
> > > > > evolutionary terms. If our lives depended on it, I bet we
> > > > > would come to hear the wind within 500,000 years.
> > > >
> > > >  Well, that's true, but it is also maybe not too important,
> > > >  because of the plasticity you mentioned. Personally,
> > > >  I -can- hear noises below 5 Hz, because in my line of
> > > >  work lives do depend on it.
> > > >
> > >
> > > James, you left me curious!
> > > What is your work, and how did you come to the conclusion that
> > > you're able to hear it (and not feel it by touch, which, as we know,
> > > can detect that frequency range)?
> >
> >
> >  My work roams around, really. It's mostly information and signal
> >  processing. Hearing noises that are outside of the normal human
> >  hearing range is an old trick involving frequency translators.
> >  You can filter the sound and translate the 0.2-5 Hz interval up to
> >  around 800 Hz, where you can hear it just fine.
> >
>
> That's a trick using instruments! It is like passing your voice
> through one of those pitch changers and talking like a child or a woman.
> I thought we were talking about our native capacities.

Surely the ability to build instruments is a native human capacity!
*Your* point was that evolutionary changes would come about
to allow us to, say, hear sounds below 5 Hz if our lives depended
upon it.  Homes, clothing, spears, plows, and, for some,
frequency translators, are life-saving instruments.

--
<J Q B>

From: "Sergio Navega" <snavega@ibm.net>
Subject: Re: Intelligent behavior.
Date: 02 Feb 1999 00:00:00 GMT
Message-ID: <36b73af7@news3.ibm.net>
References: <78tbjm$c5m@edrn.newsguy.com> <78tght$29g$1@usenet01.srv.cis.pitt.edu> <78tim8$erf@ux.cs.niu.edu> <36b31b30@news3.ibm.net> <36B39A13.C902133C@jhuapl.edu> <36b59c56@news3.ibm.net> <36B5D080.C43E884C@jhuapl.edu> <36b5e49f@news3.ibm.net> <36B68155.42ACEF32@sandpiper.net>
X-Notice: should be reported to postmaster@ibm.net
X-MimeOLE: Produced By Microsoft MimeOLE V4.71.1712.3
X-Complaints-To: postmaster@ibm.net
X-Trace: 2 Feb 1999 17:50:47 GMT, 166.72.21.130
Organization: SilWis
Newsgroups: comp.ai.philosophy

Jim Balter wrote in message <36B68155.42ACEF32@sandpiper.net>...
>Sergio Navega wrote:
>>
>> james d. hunter wrote in message <36B5D080.C43E884C@jhuapl.edu>...
>> >Sergio Navega wrote:
>> >>
>> >
>> >[...]
>> >
>> > > > >
>> > > > > No man can hear sounds with a frequency below 20 or 15 Hz.
>> > > > > The wind constantly flows around us at 0.2 to 5 Hz.
>> > > > > We do not hear it, because that was not important in
>> > > > > evolutionary terms. If our lives depended on it, I bet we
>> > > > > would come to hear the wind within 500,000 years.
>> > > >
>> > > >  Well, that's true, but it is also maybe not too important,
>> > > >  because of the plasticity you mentioned. Personally,
>> > > >  I -can- hear noises below 5 Hz, because in my line of
>> > > >  work lives do depend on it.
>> > > >
>> > >
>> > > James, you left me curious!
>> > > What is your work, and how did you come to the conclusion that
>> > > you're able to hear it (and not feel it by touch, which, as we know,
>> > > can detect that frequency range)?
>> >
>> >
>> >  My work roams around, really. It's mostly information and signal
>> >  processing. Hearing noises that are outside of the normal human
>> >  hearing range is an old trick involving frequency translators.
>> >  You can filter the sound and translate the 0.2-5 Hz interval up to
>> >  around 800 Hz, where you can hear it just fine.
>> >
>>
>> That's a trick using instruments! It is like passing your voice
>> through one of those pitch changers and talking like a child or a woman.
>> I thought we were talking about our native capacities.
>
>Surely the ability to build instruments is a native human capacity!
>*Your* point was that evolutionary changes would come about
>to allow us to, say, hear sounds below 5 Hz if our lives depended
>upon it.  Homes, clothing, spears, plows, and, for some,
>frequency translators, are life-saving instruments.
>

That's a good example of how newsgroup discussions completely lose
the point. We may have a native ability to build instruments, but
we do not have a native capacity to hear sounds below 20 Hz. The
subtle difference may be unnoticeable even to "eagle-eyed" posters
such as Jim Balter.

When we design an instrument to allow us to listen to 5 Hz and
below, we usually do it, as James said, by converting that
frequency to another within our hearing spectrum. This is very
*different* from being able to natively hear 5 Hz. In the former
case, we relocate one frequency band onto another; in the latter, we
get that frequency *in addition* to what we usually hear (20 Hz to
20 kHz, approximately). This can be the difference between life and death.

Those of you who remember the Arnold Schwarzenegger movie
"Predator" will remember that the alien had one "advantage"
over the humans: it could see in infrared. But that was its
*whole* visual spectrum. That gave the alien an advantage in
dark forests, because it could "see" the heat emanating from
human bodies. However, when big Arnold perceived this detail,
he covered his body with cold mud and the alien, even standing
right in front of him, could not see him. That was because its
infrared vision was not *in addition* to normal sight. That's
similar to what happens with those military "night viewers":
you gain infrared but *lose* conventional frequencies, because
those frequencies are used to receive the converted material.
That's also what happens with the instrument James describes.

The only way to solve this while keeping the full advantage
would be to widen the frequency spectrum itself, and that is not
an easy thing: it demands profound alterations not only in our
sensory mechanisms but also in the number of neurons that
receive and process the signals. Nature itself would need
hundreds of thousands of years to do something similar, and
only if that represented an evolutionary advantage. That was
pretty much my original argument, before it was eaten up by
the c.a.p. threads.

Regards,
Sergio Navega.

From: "Sergio Navega" <snavega@ibm.net>
Subject: Re: Intelligent behavior.
Date: 30 Jan 1999 00:00:00 GMT
Message-ID: <36b31b35@news3.ibm.net>
References: <78rdu8$c71@ux.cs.niu.edu> <78tbjm$c5m@edrn.newsguy.com> <78tght$29g$1@usenet01.srv.cis.pitt.edu> <78tiro$p8f@edrn.newsguy.com> <78toib$2nr$1@usenet01.srv.cis.pitt.edu>
X-Notice: should be reported to postmaster@ibm.net
X-MimeOLE: Produced By Microsoft MimeOLE V4.71.1712.3
X-Complaints-To: postmaster@ibm.net
X-Trace: 30 Jan 1999 14:46:13 GMT, 166.72.21.217
Organization: SilWis
Newsgroups: comp.ai.philosophy

Anders N Weinstein wrote in message
<78toib$2nr$1@usenet01.srv.cis.pitt.edu>...
>In article <78tiro$p8f@edrn.newsguy.com>,
>Daryl McCullough <daryl@cogentex.com> wrote:
>>andersw+@pitt.edu says...
>>
>>>In the back of my mind: For JJ Gibson, e.g. the stimulus for a perceptual
>>>system is a higher-order pattern in a signal. By "higher-order" I
>>>just mean that you might specify it as some function of the physical
>>>signal, e.g. the property of having such and such a derivative.
>>
>>As I said, it is clear that we can recognize faces on a computer
>>screen, even though the images are pixelated, not continuous. It
>
>But I don't see the relevance of this to the issue. We who have more
>discriminating and flexible perceptual systems can recognize patterns
>deriving from reduced stimuli. That does not show the reverse is
>true, that someone who only got reduced stimuli could always do what we
>can do.
>
>>is clear that we can recognize songs reproduced on a CD player,
>>even though we only get a discretized version of the original
>>song. Obviously, we don't require that sensory data be continuous.
>>You can approximate derivatives by finite-differences, which appear
>>to be good enough.
>
>Here at last you are on the road to a mathematical argument, at least
>with regard to approximations to continuous functions by finite
>difference approximations. Neil had a response to it: viz, that any
>particular approximation by finite differences will lose something and
>that that something could be relevant to human discrimination.
>
>I.e. any digitization will lose something, but human perceptual systems
>might come to "amplify" that something to the point where it makes a
>difference by digitizing in a different way.
>

This is the central point of the argument and the one with which I don't
agree, simply because of the lack of neurobiological evidence. Insisting
on this amounts to "fictionalizing" what is really happening within
the human body.

Do Anders and Neil claim that when a person goes blind, the better
auditory perception that follows is explained by an improvement
in the eardrum and related auditory cells?
I have never read about this.

What I read is that the *brain* of the blind person (*outside* the
sensory transducer, already "digitized") rewires itself (uses more
neurons and synapses) to improve its "digital" discriminating abilities.
This is equivalent to augmenting the whole perceptual system, but
*without* affecting the precision or accuracy of the sensory
transducers, which remain the same.

Under this view, a robot with a fixed ear and with very high
digital accuracy could present the same capability as
that of a human, *provided that it is equipped with a similarly
"rewirable" brain*. What is important is brain rewiring (in
functional terms!), not alteration/adaptation of the sensory transducers.
The latter is the business of evolution, as I said in another
post that, apparently, didn't make its arguments clear enough.

>>I think that the same kinds of high-level patterns can be
>>perceived in digitized data, as well.
>
>That may be true, but it does not seem to address Neil's claim, that
>whenever you assume a fixed digitization, something is lost, and that
>a system that could vary its digitization could detect it.

I very much doubt this happens in humans, because there's no
significant evidence (other than a recent article in Nature) of
plasticity in sensory transducers. You can't improve the digitization
of a transducer without altering its physical operation. That's not
what appears to be happening within our bodies. Only brains are
subject to plasticity.

Regards,
Sergio Navega.

From: rickert@cs.niu.edu (Neil Rickert)
Subject: Re: Intelligent behavior.
Date: 30 Jan 1999 00:00:00 GMT
Message-ID: <7902hb$giu@ux.cs.niu.edu>
References: <78toib$2nr$1@usenet01.srv.cis.pitt.edu> <36b31b35@news3.ibm.net>
Organization: Northern Illinois University
Newsgroups: comp.ai.philosophy

"Sergio Navega" <snavega@ibm.net> writes:
>Anders N Weinstein wrote in message
><78toib$2nr$1@usenet01.srv.cis.pitt.edu>...

>>I.e. any digitization will lose something, but human perceptual systems
>>might come to "amplify" that something to the point where it makes a
>>difference by digitizing in a different way.

>This is the central point of the argument and the one with which I don't
>agree, simply because of the lack of neurobiological evidence. Insisting
>on this amounts to "fictionalizing" what is really happening within
>the human body.

>Do Anders and Neil claim that when a person goes blind, the better
>auditory perception that follows is explained by an improvement
>in the eardrum and related auditory cells?
>I have never read about this.

No, we are not making any claim about changing sensory cells.
Rather, it is a matter of finding better ways of extracting
information from the output of sensory cells.  For example,
congenitally blind people are said to be able to recognize the
presence of nearby objects because of sound echoes.  This sounds like
a weak version of the bat's echolocation.  The information for this
might be in the difference in arrival time of sound signals at two
different sensors.  It might be that the brain can only measure
arrival times to within a few milliseconds, yet can distinguish a
difference in arrival times of a few microseconds.  Something like
this is said to be the case with echolocation by bats.  If the signal
is digitized at a millisecond rate, the digital output would not
allow detection of a 10 microsecond difference in arrival time.

The idea is that there is subtle information already present in the
analog output of sensory cells that might be lost in digitization,
and that the brain can rewire itself so as to be able to use that
information.
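
A toy numerical illustration of this point, in Python/NumPy; the pulse
shape, window, and sampling rates are assumptions, and it is a caricature
rather than a model of neural processing. A 10-microsecond arrival-time
difference between two sensors survives digitization at 1 MHz but
vanishes at 1 kHz:

    import numpy as np

    def estimated_delay(fs, true_delay=10e-6):
        # Digitize two copies of a smooth "analog" pulse, the second
        # arriving true_delay seconds later, then estimate the delay by
        # cross-correlation; resolution is limited to one sample period.
        t = np.arange(0.0, 0.004, 1.0 / fs)
        pulse = lambda u: np.exp(-((u - 0.002) ** 2) / (2.0 * 0.0005 ** 2))
        a = pulse(t)                   # sensor A
        b = pulse(t - true_delay)      # sensor B, 10 microseconds later
        lags = np.arange(-len(t) + 1, len(t))
        best = lags[np.argmax(np.correlate(b, a, mode="full"))]
        return best / fs

    print(estimated_delay(fs=1e6))     # 1e-05: the difference is visible
    print(estimated_delay(fs=1e3))     # 0.0:   lost in the digitization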

From: "Sergio Navega" <snavega@ibm.net>
Subject: Re: Intelligent behavior.
Date: 01 Feb 1999 00:00:00 GMT
Message-ID: <36b59c59@news3.ibm.net>
References: <78toib$2nr$1@usenet01.srv.cis.pitt.edu> <36b31b35@news3.ibm.net> <7902hb$giu@ux.cs.niu.edu>
X-Notice: should be reported to postmaster@ibm.net
X-MimeOLE: Produced By Microsoft MimeOLE V4.71.1712.3
X-Complaints-To: postmaster@ibm.net
X-Trace: 1 Feb 1999 12:21:45 GMT, 166.72.21.59
Organization: SilWis
Newsgroups: comp.ai.philosophy

Neil Rickert wrote in message <7902hb$giu@ux.cs.niu.edu>...
>"Sergio Navega" <snavega@ibm.net> writes:
>>Anders N Weinstein wrote in message
>><78toib$2nr$1@usenet01.srv.cis.pitt.edu>...
>
>>>I.e. any digitization will lose something, but human perceptual systems
>>>might come to "amplify" that something to the point where it makes a
>>>difference by digitizing in a different way.
>
>>This is the central point of the argument and the one with which I don't
>>agree, simply because of the lack of neurobiological evidence. Insisting
>>on this amounts to "fictionalizing" what is really happening within
>>the human body.
>
>>Do Anders and Neil claim that when a person goes blind, the better
>>auditory perception that follows is explained by an improvement
>>in the eardrum and related auditory cells?
>>I have never read about this.
>
>No, we are not making any claim about changing sensory cells.
>Rather, it is a matter of finding better ways of extracting
>information from the output of sensory cells.  For example,
>congenitally blind people are said to be able to recognize the
>presence of nearby objects because of sound echoes.  This sounds like
>a weak version of the bat's echolocation.  The information for this
>might be in the difference in arrival time of sound signals at two
>different sensors.  It might be that the brain can only measure
>arrival times to within a few milliseconds, yet can distinguish a
>difference in arrival times of a few microseconds.  Something like
>this is said to be the case with echolocation by bats.  If the signal
>is digitized at a millisecond rate, the digital output would not
>allow detection of a 10 microsecond difference in arrival time.
>

Ok, this is really important, and I have seen some articles dealing
with this sort of precision. In particular, there's a conjecture that
we localize the vertical position of a sound source by a very
unusual mechanism. Suppose you close your eyes and have somebody
put a sound source at your exact left side, at an unknown
height. You should be able to tell the height of the sound in
several situations, even though those situations present the same
information to the right ear (which is used to discriminate
changes in the horizontal plane). Apparently that happens because
of the folds of our outer ears. When the sound source is above
the ear, those folds provoke minor resonances that
subtly alter what we hear. When the source is below the ear,
another set of resonances forms and the sound we hear is
altered in a different way. To perceive that kind of minuscule
difference we really must have a very accurate perceptual
system. It is amazing.
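
A crude sketch of how such resonances could encode height (Python; the
two reflection delays are invented numbers, not measured pinna data): a
direct sound plus one delayed reflection acts as a comb filter whose
notches sit at odd multiples of 1/(2*delay), so a different path length
moves the notches to different frequencies:

    import numpy as np

    def notch_frequencies(delay_s, fmax=20000.0):
        # Zeros of |1 + exp(-2j*pi*f*delay)| fall at f = (2k+1)/(2*delay).
        f0 = 1.0 / (2.0 * delay_s)
        return np.arange(f0, fmax, 2.0 * f0)

    # Source above the ear: short extra path, assumed 40 microseconds.
    print(notch_frequencies(40e-6))    # [12500.] Hz
    # Source below the ear: longer extra path, assumed 80 microseconds.
    print(notch_frequencies(80e-6))    # [ 6250. 18750.] Hz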

>The idea is that there is subtle information already present in the
>analog output of sensory cells that might be lost in digitization,
>and that the brain can rewire itself so as to be able to use that
>information.
>

This is the core of my doubt and, granted, I need further reading
to tell what really happens. As far as I know, the
outputs of the ear, in their thousands of different channels, already
have that difference accounted for (and digitized). It is up to
the brain to rewire itself to become sensitive to the information
already present in those outputs.

Regards,
Sergio Navega.

From: andersw+@pitt.edu (Anders N Weinstein)
Subject: Re: Intelligent behavior.
Date: 31 Jan 1999 00:00:00 GMT
Message-ID: <7921el$c3n$1@usenet01.srv.cis.pitt.edu>
References: <78rdu8$c71@ux.cs.niu.edu> <78tiro$p8f@edrn.newsguy.com> <78toib$2nr$1@usenet01.srv.cis.pitt.edu> <36b31b35@news3.ibm.net>
Organization: University of Pittsburgh
Newsgroups: comp.ai.philosophy

In article <36b31b35@news3.ibm.net>, Sergio Navega <snavega@ibm.net> wrote:
>Anders N Weinstein wrote in message
><78toib$2nr$1@usenet01.srv.cis.pitt.edu>...
>>>
>>>>In the back of my mind: For JJ Gibson, e.g. the stimulus for a perceptual
>>>>system is a higher-order pattern in a signal. By "higher-order" I
>>>>just mean that you might specify it as some function of the physical
>>>>signal, e.g. the property of having such and such a derivative.

>Do Anders and Neil claim that when a person goes blind, the better
>auditory perception that follows is explained by an improvement
>in the eardrum and related auditory cells?
>I have never read about this.

Neil is explicitly not claiming this, and I have been trying to help
expound Neil's view as I understand it. I was drawing on my own background
study of JJ Gibson, with whose position Neil's has some affinities (but
also some differences, so one shouldn't simply class Neil among the
"Gibsonians").

>What I read is that the *brain* of the blind person (*outside* the
>sensory transducer, already "digitized") rewires itself (uses more
>neurons and synapses) to improve its "digital" discriminating abilities.
>This is equivalent to augmenting the whole perceptual system, but

It is improving its abilities to discriminate relevant aspects of
the raw signal. Those aspects might be "analog" in nature; e.g.
they might consist in certain dynamical properties and not others.
In Neil's terms this might mean changing its A/D converters or
changing the way the higher-level circuits digitize the analog signal.

>*without* affecting the precision or accuracy of the sensory
>transducers, which remain the same.
>

Right. But what you get from the "sensory transducers" is a raw analog
signal (think of the EM fields in antenna in a tuned radio set). It is
only higher-level processing that "resonates" (selectively responds) to
or works to isolate and extract some relevant aspect or feature of this
raw signal. (Think of what the adjustable tuner or color signal separation
circuitry does).

>digital accuracy could present the same capability as
>that of a human, *provided that it is equipped with a similarly
>"rewirable" brain*. What is important is brain rewiring (in
>functional terms!), not alteration/adaptation of the sensory transducers.
>The latter is the business of evolution, as I said in another
>post that, apparently, didn't make its arguments clear enough.
>
>>>I think that the same kinds of high-level patterns can be
>>>perceived in digitized data, as well.
>>
>>That may be true, but it does not seem to address Neil's claim, that
>>whenever you assume a fixed digitization, something is lost, and that
>>a system that could vary its digitization could detect it.
>
>I very much doubt this happens in humans, because there's no
>significant evidence (other than a recent article in Nature) of
>plasticity in sensory transducers. You can't improve the digitization
>of a transducer without altering its physical operation. That's not
>what appears to be happening within our bodies. Only brains are
>subject to plasticity.

Aha, you are thinking that the sensory transducers do digitization.
But that is not part of the view. Possibly the concept of "transducer"
is ambiguous. As engineers use the term, a microphone or telephone mouthpiece
transduces acoustic signals into electric signals, but does not digitize
them.

So there is no reason (what you call) the sensory transducers could not
function by passing along an analog signal, perhaps subject to some analog
transformations, e.g. to transform it into coding by spike frequencies,
or to do a kind of Fourier analysis, as is presumably done by the cochlear
hair cells. (Which NB is not yet "digitization".)
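
A cartoon of one such analog transformation, spike-frequency (rate)
coding, sketched in Python; the constants are arbitrary assumptions, and
nothing here models real cochlear physiology. Amplitude is re-expressed
as firing rate without ever being quantized into discrete amplitude
levels:

    import numpy as np

    fs = 1000.0                                    # time grid, Hz
    t = np.arange(0.0, 1.0, 1.0 / fs)
    x = 0.5 * (1.0 + np.sin(2 * np.pi * 2.0 * t))  # input amplitude in [0, 1]

    max_rate = 200.0          # spikes per second at full amplitude (assumed)
    rate = max_rate * x       # instantaneous firing rate, still analog
    # Integrate-and-fire flavor: a spike occurs whenever the accumulated
    # expected spike count crosses the next integer.
    count = np.cumsum(rate) / fs
    spikes = np.diff(np.floor(count), prepend=0.0) > 0
    print(int(spikes.sum()), "spikes; they bunch where the input is large")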

Digitization as Neil was using the term is what the higher-level
processing does to a raw physical signal.

BTW Gibson did not employ any concept like "digitization".  It is true
an articulate person can issue a verbal description of what they see
using the discrete symbols of a human language -- so perhaps you could
say the whole articulate person is one big "digitizer".

From: rickert@cs.niu.edu (Neil Rickert)
Subject: Re: Intelligent behavior.
Date: 01 Feb 1999 00:00:00 GMT
Message-ID: <795rmh$ca5@ux.cs.niu.edu>
References: <36b31b35@news3.ibm.net> <7921el$c3n$1@usenet01.srv.cis.pitt.edu>
Organization: Northern Illinois University
Newsgroups: comp.ai.philosophy

andersw+@pitt.edu (Anders N Weinstein) writes:

>Digitization as Neil was using the term is what the higher-level
>processing does to a raw physical signal.

Mainly I was using the term because it is so commonly used in AI
discussions.

>BTW Gibson did not employ any concept like "digitization".

Gibson was probably correct in not using it.  I have mixed feelings
about the appropriateness of the term when describing what happens in
the brain.  It is an aspect of digitization that one divides signals
into discrete units.  But categorization also puts things into
discrete units.  With digitization, the units are essentially
arbitrary, in the sense that they have little semantic significance.
With categorization, one tends to divide into units at natural
boundaries, so that there is considerably more semantic
significance in the discrete units.  Perhaps one should use the term
'discretization' when one does not want to distinguish between
digitization and categorization.
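
A toy contrast between the two notions, in Python; the two-cluster data
and the simple 2-means split are invented for illustration. Digitization
cuts at evenly spaced, arbitrary boundaries, while categorization places
its cut in the natural gap between clusters:

    import numpy as np

    rng = np.random.default_rng(1)
    # One-dimensional data with two natural categories near 1.0 and 5.0.
    x = np.concatenate([rng.normal(1.0, 0.3, 500), rng.normal(5.0, 0.3, 500)])

    # Digitization: eight equal-width bins whose cut points ignore where
    # the clusters actually are.
    edges = np.linspace(x.min(), x.max(), 9)
    digitized = np.digitize(x, edges[1:-1])

    # Categorization: a simple 2-means pass puts one boundary in the gap.
    thresh = x.mean()
    for _ in range(10):
        lo, hi = x[x < thresh], x[x >= thresh]
        thresh = 0.5 * (lo.mean() + hi.mean())
    categorized = (x >= thresh).astype(int)

    print(np.round(edges, 2))   # arbitrary cuts, several inside a cluster
    print(round(thresh, 2))     # about 3.0: a semantically meaningful cut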

 

