Sean.Perrin
Jul 14, 10:54 PM
Not a chance in the near future. Blu Ray and Sony are in utter shambles right now.
Sony really is in shambles... what is wrong with that company? They've really lost any focus they might have had, and some terrible ideas have come and inevitably gone. (Will the PS3 be next?)
tekmoe
Jul 27, 02:57 PM
Actually, the Merom is not completely compatible with the Yonah chips. Apple will have to do some redesign, which is supposed to delay the new MBPs. This article somewhat explains it:
http://blogs.zdnet.com/Apple/?p=249
Also, since Apple is now kind of competing with PCs that get the newest and fastest chips, it would be in Apple's best interest to get these chips into MBPs ASAP. It is also easy to see that a lot of people are waiting to purchase a new Apple laptop with this technology. The MBP's current sales are going to slump from here on out until this technology is put into some new computers.
This blog was also written by Jason O'Grady, aka the PowerPage rumor site. His writing means nothing to me.
RedTomato
Sep 13, 10:11 AM
Personally, I still see data transfer, namely from storage media, as a huge bottleneck in performance. Unless you are doing something really CPU-intensive (video editing, rendering, etc.), most of the average "wait time" is the damn hard drive.
Arrays of cheap RAM on a PCIe card?
The RAM companies don't seem interested in making wodges of slow, cheap, high-capacity RAM, only in bumping up the speed and upping the capacity. For the last 10 years, a stick of decent RAM has always been about £100/$100 no matter what the capacity or flavour of the moment is.
Even slow RAM is still orders of magnitude faster than a HD, hence my point. There are various historical and technical factors as to why we have the current situation.
I've also looked at RAID implementations (I run a RAID5) but each RAID level has its own problems.
I've recently seen that single-user RAID3 might be one way forward for the desktop, but don't really know enough about it yet.
mccldwll
Apr 27, 08:50 AM
No it's not.
And I think MOST people aren't blowing anything out of proportion. Being concerned about tracking information/privacy issues is important. Most people (stop generalizing just because some on this board are) are NOT over-reacting but are calling for deeper investigation into the issue.
Yes, it is. It's hardly tracking if distant towers are also logged. It's a minor issue. Logs need to be deleted after a short period of time. It will be done.
wiestlingjr
Jun 9, 07:23 PM
Bibbz,
I have a couple of questions. I want to preorder with RadioShack. I am NOT the primary account holder, but I am an authorized user. I also know the last four digits of the account holder's Social. Will this be a problem when picking up the phone?
I also have a FAN account. Will these be a problem?
clevin
Aug 7, 05:37 PM
Can't believe only 8 people voted for 64-bit; it's the most profound change here... all the others you can achieve with some third-party software.
boncellis
Aug 11, 01:43 PM
There's something fishy about this "story." The premise just seems unlikely.
That said, I think Apple will end up doing something about the gradual encroachment of their market share by mobile phone manufacturers. There are some qualifiers, however:
* It can't cannibalize iPod sales, which means either the "iPhone" will somehow be limited, or the iPod will see new features separating the two.
* It will have to be more than just a mobile phone with iTunes, integrating essential smartphone functions and something else that makes it stand out (maybe VoIP capability).
These are pretty obvious when you think about it, and I'm sure Apple has been thinking about it for some time. An Apple mobile phone could be imminent; you can sometimes tell by looking around the industry and spotting the "preemptive" or anticipatory products from competitors. It's not an accident that the LG Chocolate phone looks a lot like the iPod nano, in my opinion.
rezenclowd3
Dec 9, 11:38 AM
Nuck, about the trigger travel: I am right. I wasn't asking about holding corner speed, and your tirade against me is very juvenile. When I pull just over 3/4 travel on the trigger, the in-game accelerator display is showing I am pulling full throttle. It was this way in Prologue as well.
eb6
Sep 19, 09:50 AM
Can't we stop all this Mac on Mac hate and just get along?:)
Benjy91
Mar 31, 02:30 PM
Lol, the fragmentation that "doesn't exist".
I knew it would bite them in the ass someday.
xxBURT0Nxx
Apr 7, 09:20 AM
I'm getting tired of Apple Mac's being INTEL's BIATCH!
Integrated graphics on a laptop costing THAT MUCH? PLEASE!
Steve Jobs should threaten to switch to AMD/ATI solutions, even if just for leverage with Intel to get discrete graphics chips in these machines.
If this is true, this is a pathetic technology compromise in my opinion.
I would say the decision not to use discrete graphics is Apple's, in order to save room inside the machine and make it small. If you want discrete graphics, you can buy a MacBook Pro...? You make it seem like Intel told Apple they can't use the SB chips unless they use the IGP, which is obviously false. You are paying for the small, lightweight, portable laptop with the Air, obviously not for what's inside of it, save for maybe the SSD.
kwyn
Jun 8, 06:49 PM
How bout Best Buy?
macse30
Apr 27, 07:59 AM
I wish they would leave it on and let me use it. I consider it a feature. It would help me track hours at job sites automatically for billing. I thought of writing an app just for that.
j_maddison
Jul 20, 11:53 AM
How fast do you want mail to go?
As fast as possible! Don't worry, I do agree that email and browsing have very little to do with processor speed, but you did ask the question! Now if only I could get a fibre link to my house without it costing a few hundred thousand pounds a year, hmm :rolleyes:

jljue
Apr 27, 08:44 AM
A lot of people are upset over this, but no one seems to care that the US Government has been able to snoop on any electronic communication it wants for well over 10 years now: http://en.wikipedia.org/wiki/Echelon_(signals_intelligence)
Data transmissions, cell phone calls, you name it. I think we're trying to cook the wrong goose if you ask me.
Lawmakers apparently have forgotten that they enacted a law requiring location ID on cell phones for emergency purposes--another indication that we have too many laws. :confused:
DotCom2
Apr 27, 09:25 AM
Problem is, if you turn "Location Services" off, then you can't use "Find My iPhone" which I think is quite a useful feature! :(
hulugu
Mar 18, 10:57 PM
What pacifist ever has a realistic chance of becoming the next "commander-in-chief"?
That's why 5P's contention is so ridiculous.
Candidates must paint themselves as "strong" and capable of leading our military, otherwise there'd be little chance they'd be elected as president.
Foreign adventurism is as American as apple pie, but post-World War II it's become a structural constant that no single president is going to change. Paul talks, but when it came down to actually withdrawing US troops from foreign bases, I seriously doubt that it would go as smoothly as fivepoint and Paul suggest.
It's a worthwhile consideration of Obama that he seems more hawk than dove these days, but I don't see another viable candidate from 2008 that would have done any better because these are difficult and complex problems.
miketcool
Jul 20, 09:50 AM
You realize there are probably only four people on this board who are old enough to get that joke, right?
My Quadra still runs; I guess I'm the fourth person to get it.
This feels almost like an Onion article:
Home Computer Gives Birth to Octuple-Cores
<enter photoshopped picture of a Mac Pro cradling its newborn octuplets>
janstett
Oct 23, 11:44 AM
Unfortunately, not many multithreaded apps - yet. For a long time, most of the multithreaded apps were a select few pro-level things: 3D/visualization software, CAD, database systems, etc. Those of us who had multiprocessor systems bought them because we had a specific piece of software in mind, or a group of applications, that could take advantage of multiple processors. As current CPU manufacturing processes started hitting a wall right around the 3GHz mark, chip makers started to transition to multiple CPU cores to boost power - makes sense. Software developers have been lazy for years, just riding the wave of ever-increasing MHz. Now the multi-core CPUs are here and the software is behind, as many applications need serious re-writes in order to take advantage of multiple processors. Intel tried to get a jump on this with their HT (Hyper-Threading) implementation, which essentially simulated dual cores on a CPU by way of two virtual CPUs. Software developers didn't exactly jump on this and warm up to it. But I also don't think the software industry truly believed that CPUs would go multi-core on a mass scale so fast... Intel and AMD both said they would; I don't know why the software industry doubted them. Intel and AMD are uncommonly good about telling the truth about upcoming products. Both will be shipping quad-core CPU offerings by year's end.
What you're saying isn't entirely true and may give some people the wrong idea.
First, a multicore system is helpful when running multiple CPU-intensive single-threaded applications on a proper multitasking operating system. For example, right now I'm ripping CDs on iTunes. One processor gets used a lot and the other three are idle. I could be using this CPU power for another app.
The reality is that to take advantage of multiple cores, you had to take advantage of threads. Now, I was doing this in my programs with OS/2 back in 1992. I've been writing multithreaded apps my entire career. But writing a threaded application requires thought and work, so naturally many programmers are lazy and avoid threads. Plus it is harder to debug and synchronize a multithreaded application. Windows and Linux people have been doing this since the stone age, and Windows/Linux have had usable multiprocessor systems for more than a decade (it didn't start with Hyperthreading). I had a dual-processor 486 running NT 3.5 circa 1995. It's just been more of an optional "cool trick" to write threaded applications that the timid programmer avoids. Also it's worth noting that it's possible to go overboard with excessive threading and that leads to problems (context switching, thrashing, synchronization, etc).
Now, on the Mac side, OS 9 and below couldn't properly support SMP and it required a hacked version of the OS and a special version of the application. So the history of the Mac world has been, until recently with OSX, to avoid threading and multiprocessing unless specially called for and then at great pain to do so.
So it goes back to getting developers to write threaded applications. Now that we're getting to 4 and 8 core systems, it also presents a problem.
The classic reason to create a thread is to prevent the GUI from locking up while processing. Let's say I write a GUI program that has a calculation that takes 20 seconds. If I do it the lazy way, the GUI will lock up for 20 seconds because it can't process window messages during that time. If I write a thread, the calculation can take place there and leave the GUI thread able to process messages and keep the application alive, and then signal the other thread when it's done.
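That worker-thread pattern fits in a few lines. Here's a minimal, non-GUI Python sketch of the idea; `long_calculation` is just a stand-in for the real 20-second job, and the queue plays the role of the "signal when done" step:

```python
import threading
import queue

def long_calculation(n):
    # Stand-in for the slow job: sum of squares below n.
    return sum(i * i for i in range(n))

def run_in_background(func, arg, done_queue):
    # Worker thread: do the heavy work, then signal the main
    # ("GUI") thread by posting the result on a thread-safe queue.
    done_queue.put(func(arg))

results = queue.Queue()
worker = threading.Thread(target=run_in_background,
                          args=(long_calculation, 100_000, results))
worker.start()
# ...here the main thread would keep pumping window messages...
worker.join()
print(results.get())
```

In a real GUI app you wouldn't `join()` (that blocks again); you'd poll the queue from the event loop or post a "done" event back to the GUI thread.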
But now with more than 4 or 8 cores, the problem is how do you break up the work? 9 women can't have a baby in a month. So if your process is still serialized, you still have to wait with 1 processor doing all the work and the others sitting idle. For example, if you encode a video, it is a very serialized process. I hear some work has been done to simultaneously encode macroblocks in parallel, but getting 8 processors to chew on a single video is an interesting problem.
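When the job does allow it, the usual answer to "how do you break up the work?" is data parallelism: split the input into independent chunks, one per core. A toy Python sketch (note `encode_chunk` is only a stand-in here, not real video encoding, which has inter-frame dependencies):

```python
from multiprocessing import Pool

def encode_chunk(frames):
    # Stand-in for encoding one slice of a video:
    # here we just square each "frame" value.
    return [f * f for f in frames]

if __name__ == "__main__":
    video = list(range(16))
    # Split the job into 4 independent chunks, one per core.
    chunks = [video[i::4] for i in range(4)]
    with Pool(processes=4) as pool:
        encoded = pool.map(encode_chunk, chunks)
    print(encoded)
```

The hard part, exactly as above, is that real encoders have serial dependencies between frames, so the chunks aren't truly independent without tricks like parallel macroblock encoding.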
layte
Mar 31, 03:04 PM
From now on, companies hoping to receive early access to Google's most up-to-date software will need approval of their plans.
Emphasis on the important bit, for those who didn't bother to actually read the article. If you want to wait a bit, you can get the code and do whatever you want. Well, that's my reading of it anyway, but please, don't let that get in the way of giving the new enemy number one a good kicking.
ghostlyorb
Apr 8, 08:17 AM
How many times does it need to be said, "don't screw around with Apple"?
Koufax80
Apr 25, 02:41 PM
Damnit! I just looked outside and saw Steve Jobs with a clipboard... Apple must have sent him to track my location since I turned my phone off...
Dr.Gargoyle
Aug 11, 05:10 PM
But you're really forgetting one thing: international trademark/patent law is a pain in the @$$!!! I wouldn't blame Apple for one minute for keeping it in the US for at least a test run. That way they should be able to keep the patent-breaking reverse engineers off their back for at least a little while (i.e. why copy something if you can't even use it anywhere other than where it's patent protected?).
If the rest of the world would get a handle on international trademarking and patent protection I don't think we'd have this issue of different standards of EU vs USA...
:confused: Patent intrusion in Europe??? Are you serious? Do you have any examples to verify your claims, where a European company violated US patent law and this wasn't enforced by the European judicial system?
icloud
Aug 7, 03:39 PM
Lots of things changed from the first views of Tiger to the creature it is today. I think there's a lot more hiding in Leopard than we found out today.
P.S. I hope to God a new Finder and the death of brushed metal are among those "secrets".