A computer components & hardware forum. HardwareBanter


Intel details future Larrabee graphics chip



 
 
  #111  
Old August 14th 08, 12:13 PM posted to alt.comp.hardware.pc-homebuilt,comp.arch,sci.electronics.design
Wilco Dijkstra[_2_]
external usenet poster
 
Posts: 49
Default Intel details future Larrabee graphics chip


"Martin Brown" wrote in message ...

The worst pointer-related faults I have ever had to find were as an outsider diagnosing faults in a customer's large
software base. The crucial mode of failure was a local copy of a pointer to an object that was subsequently
deallocated but stayed around unmolested for long enough for the program to mostly still work - except when it didn't.


Those are very nasty indeed. However they aren't strictly pointer related -
a language without pointers suffers from the same issue (even garbage
collection doesn't solve this kind of problem). Valgrind is good at finding
issues like this. Automatic checking tools have improved significantly
over the last 10 years.

The worst problem I've seen is a union of a pointer and an integer which
was used as a set of booleans, and the code confused the two. So the
bottom few bits of the pointer were sometimes changed by setting or
clearing the booleans. Similarly the values of the booleans differed
on different systems, or if you changed command-line options, compiled
for debug etc.
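The pattern Wilco describes - boolean flags packed into the bottom bits of a pointer - can be sketched as below (all names are invented for illustration). It is only sound while the pointee's alignment guarantees those bits are zero, and any code that manipulates the word as plain booleans without masking clobbers the address:

```c
#include <assert.h>
#include <stdint.h>

#define FLAG_BITS ((uintptr_t)3)   /* two boolean flags in bits 0-1 */

/* Set flag bits in the tagged word. */
static uintptr_t tag_set(uintptr_t w, uintptr_t flags)
{
    return w | (flags & FLAG_BITS);
}

/* Read back the flag bits. */
static uintptr_t tag_flags(uintptr_t w)
{
    return w & FLAG_BITS;
}

/* Recover the pointer: the flags MUST be masked off first.
 * Forgetting this mask is exactly the "bottom bits of the
 * pointer changed by the booleans" failure described above. */
static void *tag_ptr(uintptr_t w)
{
    return (void *)(w & ~FLAG_BITS);
}
```

This only works because an `int` (or any type with alignment of at least 4) leaves the two low address bits clear; pack a `char *` in there and the scheme silently corrupts the pointer.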

Wilco


  #112  
Old August 14th 08, 01:46 PM posted to alt.comp.hardware.pc-homebuilt,comp.arch,sci.electronics.design
Terje Mathisen
external usenet poster
 
Posts: 39
Default Intel details future Larrabee graphics chip

Wilco Dijkstra wrote:
"Terje Mathisen" wrote in message ...
How many ways can you define such a function?

The only serious alternatives would be in the handling of negative-or-zero inputs or when rounding the actual fp
result to integer:

Do you want the Floor(), i.e. truncate, Ceil() or Round_to_nearest_or_even()?

Using the last alternative could make it harder to come up with a perfect implementation, but otherwise it should be
trivial.
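For illustration, once a correct floor version exists (one appears later in the thread), the ceiling variant falls out of it almost for free; a sketch:

```c
#include <assert.h>

/* floor(log2(x)): position of the top set bit; -1 for x == 0. */
static int log2_floor(unsigned x)
{
    int n = -1;
    for ( ; x != 0; x >>= 1)
        n++;
    return n;
}

/* ceil(log2(x)): equal to the floor unless x has more than one
 * bit set, i.e. unless x is an exact power of two. */
static int log2_ceil(unsigned x)
{
    return log2_floor(x) + ((x & (x - 1)) != 0);
}
```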


It was a trivial routine, just floor(log2(x)), i.e. finding the top bit that is set.
The mistakes were things like not handling zero, using signed rather than
unsigned variables, looping forever for some inputs, or returning the floor result + 1.

Rather than just shifting the value right until it becomes zero, one version created a mask
and shifted it left until it was *larger* than the input (which is not going to work
if you use a signed variable for it, or if the input has bit 31 set, etc).

My version was something like:

int log2_floor(unsigned x)
{
    int n = -1;
    for ( ; x != 0; x >>= 1)
        n++;
    return n;
}


BG

That is _identical_ to the code I originally wrote as part of my post,
but then deleted as it didn't really add to my argument. :-)

There are of course many possible alternative methods, including inline
asm to use a hardware bitscan opcode.
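On GCC and Clang the hardware bitscan is reachable without inline asm via a builtin (a sketch; `__builtin_clz` is undefined for a zero argument, so that case needs an explicit check):

```c
#include <assert.h>
#include <limits.h>

/* log2_floor via count-leading-zeros (GCC/Clang-specific builtin).
 * Returns -1 for x == 0 to match the shift-loop convention. */
static int log2_floor_clz(unsigned x)
{
    if (x == 0)
        return -1;                 /* __builtin_clz(0) is undefined */
    return (int)(sizeof(unsigned) * CHAR_BIT) - 1 - __builtin_clz(x);
}
```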

Here's a possibly faster version:

int log2_floor(unsigned x)
{
    int n = 0;    /* NB: unlike the loop version, this returns 0 for x == 0 */
    while (x >= 0x10000) {
        n += 16;
        x >>= 16;
    }
    if (x >= 0x100) {
        n += 8;
        x >>= 8;
    }
    if (x >= 0x10) {
        n += 4;
        x >>= 4;
    }
    /* At this point x has been reduced to the 0-15 range, use a
     * register-internal lookup table:
     */
    uint32_t lookup_table = 0xffffaa50;
    int lookup = (int) (lookup_table >> (x + x)) & 3;

    return n + lookup;
}

or to make it branchless:

int log2_floor(unsigned x)
{
    int n = 0;    /* NB: unlike the loop version, this returns 0 for x == 0 */
    int gt = (x >= 0x10000) << 4; // 0 or 16
    n += gt;
    x >>= gt;

    gt = (x >= 0x100) << 3;
    n += gt;
    x >>= gt;

    gt = (x >= 0x10) << 2;
    n += gt;
    x >>= gt;

    uint32_t lookup_table = 0xffffaa50; // 0011222233333333
    int lookup = (int) (lookup_table >> (x + x)) & 3;

    return n + lookup;
}
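For what it's worth, the two approaches cross-check with a small self-contained harness. Note the table variants must start with `n = 0` rather than `n = -1` (otherwise every nonzero input comes out one low, since the 2-bit table entries already hold floor(log2) for 1..15), and they return 0 rather than -1 for a zero input:

```c
#include <assert.h>
#include <stdint.h>

/* Reference: shift right until the value is exhausted. */
static int log2_floor_ref(unsigned x)
{
    int n = -1;
    for ( ; x != 0; x >>= 1)
        n++;
    return n;
}

/* Branchless variant per the post; returns 0 (not -1) for x == 0. */
static int log2_floor_branchless(unsigned x)
{
    int n = 0;
    int gt = (x >= 0x10000) << 4;       /* 0 or 16 */
    n += gt;
    x >>= gt;

    gt = (x >= 0x100) << 3;             /* 0 or 8 */
    n += gt;
    x >>= gt;

    gt = (x >= 0x10) << 2;              /* 0 or 4 */
    n += gt;
    x >>= gt;

    /* 2-bit entries: floor(log2(x)) for x in 1..15 */
    uint32_t lookup_table = 0xffffaa50;
    return n + ((int)(lookup_table >> (x + x)) & 3);
}
```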

Terje
--
-
"almost all programming can be viewed as an exercise in caching"
  #113  
Old August 14th 08, 02:10 PM posted to alt.comp.hardware.pc-homebuilt,comp.arch,sci.electronics.design
Nick Maclaren
external usenet poster
 
Posts: 72
Default Intel details future Larrabee graphics chip


In article ,
"Wilco Dijkstra" writes:
|
| I'd certainly be interested in the document. My email is above, just make
| the obvious edit.

Sent.

| | I bet that most code will compile and run without too much trouble.
| | C doesn't allow that much variation in targets. And the variation it
| | does allow (eg. ones'-complement) is not something sane CPU
| | designers would consider nowadays.
|
| The mind boggles. Have you READ the C standard?
|
| More than that. I've implemented it. Have you?

Some of it, in an extremely hostile environment. However, that is a lot
LESS than having written programs that get ported to radically different
systems - especially ones that you haven't heard of when you wrote the
code. And my code has been so ported, often without any changes needed.

| It's only when you implement the standard you realise many of the issues are
| irrelevant in practice. Take sequence points for example. They are not even
| modelled by most compilers, so whatever ambiguities there are, they simply
| cannot become an issue.

They are relied on, heavily, by ALL compilers that do any serious
optimisation. That is why I have seen many problems caused by them,
and one reason why HPC people still prefer Fortran.
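A minimal illustration of the kind of expression at issue (a sketch): in the unsequenced form `a[i] = i++;` the modification of `i` and its use as an index have no sequence point between them, so an optimiser may order them either way; splitting the expression into statements pins the order down:

```c
#include <assert.h>

/* The well-defined rewrite: the read of i and its increment are
 * separated by a statement boundary (a sequence point), so the
 * result cannot vary between compilers or optimisation levels. */
static int demo_well_defined(void)
{
    int a[4] = {0};
    int i = 1;
    a[i] = i;   /* read-only use of i: fine            */
    i++;        /* separate statement: order is fixed  */
    /* a[i] = i++;  <- the unsequenced form: undefined behaviour */
    return a[1] * 10 + i;
}
```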

| Similarly various standards pedants are moaning
| about shifts not being portable, but they can never mention a compiler that
| fails to implement them as expected...

Shifts are portable if you code them according to the rules, and don't
rely on unspecified behaviour. I have used compilers that treated
signed right shifts as unsigned, as well as ones that used only the
bottom 5/6/8 bits of the shift value, and ones that raised a 'signal'
on left shift overflow. There are good reasons for all of the
constraints.

No, I can't remember which, offhand, but they included the ones for
the System/370 and Hitachi S-3600. But there were also some
microprocessor ones - PA-RISC? Alpha?

| Btw Do you happen to know the reasoning behind signed left shifts being
| undefined while right shifts are implementation defined.

Signed left shifts are undefined only if they overflow; that is undefined
because anything can happen (including the CPU stopping). Signed right
shifts are only implementation defined for negative values; that is
because they might be implemented as unsigned shifts.
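To make the point concrete, here is a sketch of a right shift of negative values that does not depend on the implementation-defined behaviour, built from division (which the standard does define; C99 division truncates toward zero, so negative dividends must be rounded down explicitly to match an arithmetic shift):

```c
#include <assert.h>

/* Portable arithmetic shift right for int.  s must be small enough
 * that (1 << s) does not overflow (0..30 for 32-bit int). */
static int asr_portable(int v, int s)
{
    int d = 1 << s;
    /* v / d truncates toward zero; for negative v with a nonzero
     * remainder, subtract 1 to round toward minus infinity instead. */
    return (v < 0 && v % d != 0) ? v / d - 1 : v / d;
}
```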

| It will work as long as the compiler supports a 32-bit type - which it will of
| course. But in the infinitesimal chance it doesn't, why couldn't one
| emulate a 32-bit type, just like 32-bit systems emulate 64-bit types?

Because then you can't handle the 64-bit objects returned from the
library or read in from files! Portable programs will handle whatever
size of object the system supports, without change - 32-bit, 64-bit,
48-bit, 128-bit or whatever.

| Actually various other languages support sized types and most software
| used them long before C99. In many cases it is essential for correctness
| (imagine writing 32 bits to a peripheral when it expects 16 bits etc). So
| you really have to come up with some extraordinary evidence to explain
| why you think sized types are fundamentally wrong.

Not at all. That applies ONLY to the actual external interface, and
Terje and I have explained why C fixed-size types don't help.


Regards,
Nick Maclaren.
  #114  
Old August 14th 08, 03:08 PM posted to alt.comp.hardware.pc-homebuilt,comp.arch,sci.electronics.design
John Larkin
external usenet poster
 
Posts: 307
Default Intel details future Larrabee graphics chip

On Thu, 14 Aug 2008 11:00:02 +0100, Martin Brown
wrote:

John Larkin wrote:
On Wed, 13 Aug 2008 10:40:47 +0100, Martin Brown
wrote:


I read that for a major bunch of Windows APIs, the only documentation
was the source code itself.


That is probably slightly unfair (but also partly true). It was the
unruly Windows message API that eventually killed the strongly typed
language interface for me. Just about every message was a pointer to a
"heaven knows what object" that you had to manually prod and probe at
runtime to work out its length and then what it claimed to be.
Maintaining the strongly typed interface definitions even with tools
became too much of a chore.

Imagine doing electronics where all components are uniform sized and
coloured and you have to unpeel the wrapper to see what is inside. Worse
still some of them may contain uninitialised pointers if you are
unlucky, or let you write off the end of them with disastrous results.


Hardware design keeps moving up in abstraction level too. I used to
design opamps and voltage regulators out of transistors. Now I'm
dropping sixteen isolated delta-sigma ADCs around an FPGA that talks
to a 32-bit processor. That's sort of equivalent to building a
software system using all sorts of other people's subroutines.


Libraries do exist in software too and some of them are very good. A lot
of code reuse projects do fail because the gatekeeper is pressurised to
put things in that are not of the required standard for reuse. I recall
one that was called the suppository by the engineers ordered to use it.

Using off the shelf reliable software components and making a living
selling them has not taken off the way it should have. Generally you
have to buy a whole library and licence.

And remember that your hardware design is done on a software tool...



Yes, I'm a great fan of design automation...

ftp://66.117.156.8/Auto.jpg


John

  #115  
Old August 14th 08, 03:52 PM posted to alt.comp.hardware.pc-homebuilt,comp.arch,sci.electronics.design
Martin Brown
external usenet poster
 
Posts: 33
Default Intel details future Larrabee graphics chip

John Larkin wrote:
On Thu, 14 Aug 2008 11:00:02 +0100, Martin Brown
wrote:

John Larkin wrote:
On Wed, 13 Aug 2008 10:40:47 +0100, Martin Brown
wrote:

And remember that your hardware design is done on a software tool...


Yes, I'm a great fan of design automation...

ftp://66.117.156.8/Auto.jpg


I like pencil and paper too.

I'd like to see you run a spice simulation on an HP-35!

Regards,
Martin Brown
** Posted from http://www.teranews.com **
  #116  
Old August 14th 08, 04:24 PM posted to alt.comp.hardware.pc-homebuilt,comp.arch,sci.electronics.design
John Larkin
external usenet poster
 
Posts: 307
Default Intel details future Larrabee graphics chip

On Thu, 14 Aug 2008 15:52:23 +0100, Martin Brown
wrote:

John Larkin wrote:
On Thu, 14 Aug 2008 11:00:02 +0100, Martin Brown
wrote:

John Larkin wrote:
On Wed, 13 Aug 2008 10:40:47 +0100, Martin Brown
wrote:

And remember that your hardware design is done on a software tool...


Yes, I'm a great fan of design automation...

ftp://66.117.156.8/Auto.jpg


I like pencil and paper too.

I'd like to see you run a spice simulation on an HP-35!

Regards,
Martin Brown
** Posted from http://www.teranews.com **


The first sim I ever ran was of a steamship throttle control system,
run on the HP9100 programmable desktop calculator, the predecessor to
the HP35. I soon moved it to a PDP-8 running Focal, so I could plot
the loop step responses on the teletype. We got a *lot* of business,
maybe $100M, from those simulations.

I was recently coding a serial-input interrupt service routine in 68K
assembly. It does a bunch of character editing and handles a few
special cases... it presents a cleaned-up command string and a ready
flag to the mainline loop, which does other stuff in addition to
checking for serial commands and parsing/executing. It's only a page
of code, but I had to draw it (yes, on D-size vellum) to really get it
right. The paths turned out to look really strange. I reference the
numbered drawing in the code, and will include a jpeg with the program
source when it's formally released.

We don't need no stinkin' RTOS!

Another difference between engineers and programmers is that
engineering tends to be visual/parallel and programming tends to be
textual/sequential. It's fun to watch programmer types attack a logic
HDL; everything happens everywhere, all at once!

John

  #117  
Old August 15th 08, 02:39 PM posted to alt.comp.hardware.pc-homebuilt,comp.arch,sci.electronics.design
Jan Vorbrüggen[_2_]
external usenet poster
 
Posts: 2
Default Intel details future Larrabee graphics chip

That's not what happened. They hired David Cutler from DEC, where he
had worked on VMS, and pretty much left him alone. The chaos was and
is part of the culture of modern programming.


His work was significantly more disciplined when he worked for DEC than
what was the result from Redmond. But he didn't have a choice: Backward
compatibility, bug for bug and misfeature for misfeature, rule(d|s)
supreme in the Windows realm.

Jan
  #118  
Old August 15th 08, 02:52 PM posted to alt.comp.hardware.pc-homebuilt,comp.arch,sci.electronics.design
Nick Maclaren
external usenet poster
 
Posts: 72
Default Intel details future Larrabee graphics chip


In article ,
Jan Vorbrüggen writes:
|
| That's not what happened. They hired David Cutler from DEC, where he
| had worked on VMS, and pretty much left him alone. The chaos was and
| is part of the culture of modern programming.
|
| His work was significantly more disciplined when he worked for DEC than
| what was the result from Redmond. But he didn't have a choice: Backward
| compatibility, bug for bug and misfeature for misfeature, rule(d|s)
| supreme in the Windows realm.

Oh, it was worse than that! After he had done the initial design
(which was reasonable, if not excellent), he was elbowed out, and
half of his design was thrown out to placate the god Benchmarketing.

The aspect that I remember was that the GUI was brought back from
where he had exiled it to the 'kernel' - and, as we all know, the
GUIs are the source of all ills on modern systems :-(


Regards,
Nick Maclaren.
  #119  
Old August 15th 08, 03:38 PM posted to alt.comp.hardware.pc-homebuilt,comp.arch,sci.electronics.design
Chris M. Thomasson
external usenet poster
 
Posts: 46
Default Intel details future Larrabee graphics chip


"Jan Vorbrüggen" wrote in message
...
That's not what happened. They hired David Cutler from DEC, where he
had worked on VMS, and pretty much left him alone. The chaos was and
is part of the culture of modern programming.


His work was significantly more disciplined when he worked for DEC than
what was the result from Redmond.

But he didn't have a choice: Backward compatibility, bug for bug and
misfeature for misfeature, rule(d|s) supreme in the Windows realm.


;^(...

  #120  
Old August 15th 08, 03:47 PM posted to alt.comp.hardware.pc-homebuilt,comp.arch,sci.electronics.design
John Larkin
external usenet poster
 
Posts: 307
Default Intel details future Larrabee graphics chip

On 15 Aug 2008 13:52:54 GMT, (Nick Maclaren) wrote:


In article ,
Jan Vorbrüggen writes:
|
| That's not what happened. They hired David Cutler from DEC, where he
| had worked on VMS, and pretty much left him alone. The chaos was and
| is part of the culture of modern programming.
|
| His work was significantly more disciplined when he worked for DEC than
| what was the result from Redmond. But he didn't have a choice: Backward
| compatibility, bug for bug and misfeature for misfeature, rule(d|s)
| supreme in the Windows realm.



They had special flags for running specific applications that tuned
the API bug set so the major apps could still run.



Oh, it was worse than that! After he had done the initial design
(which was reasonable, if not excellent), he was elbowed out, and
half of his design was thrown out to placate the god Benchmarketing.

The aspect that I remember was that the GUI was brought back from
where he had exiled it to the 'kernel' - and, as we all know, the
GUIs are the source of all ills on modern systems :-(



That did confuse me a little. The book has him holding out, against
Gates even, for a small kernel with a client-server relationship to
everything else, including all the graphics. The story ends happily
there, with nothing left to do but fix the circa 1000 bugs initially
shipped. I suppose the kernel was trashed/bloated later in the name of
speed.


John


 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.