i3iz
Sep 26, 02:00 AM
Old news... check this webpage: http://anandtech.com/mac/showdoc.aspx?i=2832&p=6
"We grabbed a pair of 2.4GHz Clovertown samples and tossed them in the system, and to our pleasure, they worked just fine. Our samples used a 1066MHz FSB, although we're expecting the final chip to use a 1333MHz FSB, but the most important part of the test is that all 8 cores were detected and functional. "
UnixMac
Oct 11, 09:04 AM
How does it run on an UltraSparc III 900?
How does it run on an Alpha?
Let's get an assortment of scores; there could be a code bug for the G4. I am not an expert, but 10-20 times slower sounds like science fiction.
bmullemeister
May 2, 06:23 PM
I just received an email with this site:
http://www.zdnet.com/blog/bott/coming-soon-to-a-mac-near-you-serious-malware/3212?tag=nl.e589
Macs getting targeted after many years
Bert
iindigo
May 2, 10:20 AM
It is safer to run under an administrator account all the time in OS X than in Windows. On Windows, the administrator is almost equivalent to the root account on *nixes and as such has unrestricted access to any and all files on the system.
On OS X and other *nix systems, however, the administrator account still can't do all that much without entering the root password. Admin accounts can't touch anything in the System folder. About the worst malware can do, even under an admin account in OS X, is one of the following:
1) Install itself in your user account Library folder
2) Install itself in the system's secondary Library folder (/Library/)
In both cases, the offending executables/libraries/whatever are easily removed. In the case of #1, create a new account and copy your old stuff over. In the case of #2, check the startup folder within, perhaps frameworks in some cases (though I have never seen malware that makes use of the OS X framework system), and delete the malware files. The files and folders contained in the Library folder are all nicely, neatly labeled, and any malware should stick out like a sore thumb - it can't hide as something like EXPLORE32.EXE.
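The "check the Library folders and delete what sticks out" idea above can be sketched in a few lines. This is a rough illustration, not an official Apple tool: the directory paths are the usual launchd persistence locations, and the known-good list is something you would maintain yourself.

```python
# Sketch: flag unfamiliar items in the launch folders where user-level
# malware (no root password required) would have to install itself.
import os

# Typical persistence locations writable without the root password
# (assumed typical; your system may have more)
PERSISTENCE_DIRS = [
    os.path.expanduser("~/Library/LaunchAgents"),  # case 1: user Library
    "/Library/LaunchAgents",                       # case 2: secondary Library
    "/Library/StartupItems",
]

def unfamiliar(entries, known_good):
    """Return entries not on a user-maintained known-good list."""
    return sorted(set(entries) - set(known_good))

def scan(dirs=PERSISTENCE_DIRS, known_good=()):
    findings = {}
    for d in dirs:
        if os.path.isdir(d):
            findings[d] = unfamiliar(os.listdir(d), known_good)
    return findings

if __name__ == "__main__":
    for d, items in scan().items():
        print(d, items)
```

Anything the scan prints that you don't recognize is a candidate for deletion, which is exactly the "sore thumb" point: these folders are small and human-readable.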

ender land
Apr 23, 10:11 PM
I'm not sure I understand the point in the first part of your post so I'll have to skip that for now. Maybe you can phrase it a different way to help me out. Anyway, the whole "moral" issue has been raised and argued before. In my mind, there are many reasons why, logically, atheists are, by far, more moral than religious people. I'll just throw one out at you: your statement that someone who is a practicing theist has a "standard" of morals to abide by isn't something I can agree with, for many reasons. One, why does one have to have a religious book to have a standard of morals? Atheists can know right and wrong and make laws based on common sense morals. We don't need some made up god to tell us what is right and wrong. Secondly, have you read some of the "morals" in the holy books? If so, and you still follow these rules, you have very low standards for what good morals should be. One needs to look no further than the section on how to treat your slaves in the bible to see this fact!
Ugh, so much ignorance (hopefully unintentional), I don't know where to start...
If you are theistic, clearly it would make sense to base morality off what your God believes. Not doing so would be the equivalent of an atheist not agreeing with the scientific method.
Everything you say is hinged upon the belief religions are all wrong. If this is in fact true, I suppose you having this belief is true. Though you could also debate this back and forth, IF religion is all wrong, any religious morals are therefore created by those who practiced/invented the religion, which means there are far more viewpoints having gone into the creation of such morals.
Thirdly, it doesn't even matter whether the above is true with respect to what you said, even if religion is 100% made up, people who are religious (I'll pick on GWB again since he was by far more practicing Christian than Obama) are still basing their beliefs on something which is written down. This makes them more trustworthy, or perhaps a better word would be predictable. It is unlikely that someone like GWB will suddenly ever go "you know what, I think you're right, it's totally ok to allow abortion" because his beliefs are based on something which will not change. On the other hand, a politician who is completely atheistic has no such 'check' or 'reference' which means you have no idea that their position will not change.
"Common sense morals?" lol! There are so many examples of morals not being "common sense" both inside and outside theistic cultures. These "common sense" morals are only common sense because you personally believe in them, at the current time, given your set of circumstances. It is entirely possible they drastically change over time. A great example is the one you pointed out, slavery. Plenty of people thought it was "common sense" to allow slavery. What changed? Did people suddenly get "more common sense?" It seems likely to me that something like abortion is likely to eventually become a "common sense to outlaw" thing, while gay marriage will become a "wtf does the government care" common sense thing; neither of these is the current state in the United States.
Not to mention, common sense morals more or less is exactly what I am referring to when saying societal morals. The "this is morality as we see it, duh!" type of morality.
Regarding your final point, I am almost positive I have read more of the Bible and understand what it is saying better than you. I am not going to debate a book you seemingly do not know with you, so I will offer this: there is a difference between Old Testament law and the New Testament in terms of how we, ie not Jews living more than 2300 years ago, should interpret them in our daily lives. Not to mention, much of the Old Testament was written to a specific group of people at a specific time (that was a long time ago), which even if New Testament did not "free" us from Old Testament law, that slavery was much different at the time in practice and implementation (see Leviticus 25). Plus if you do want to see how to treat slaves from a Biblical standpoint, in light of Christ, read the book of Philemon in the New Testament, which specifically is written to a slaveowner from Paul.
Don't panic
Mar 15, 08:25 PM
Continuous live timestamped text based updates:
http://www.bbc.co.uk/news/world-middle-east-12307698
(may be a different link tomorrow, but check on the front page for the current link to live updates)
http://www.guardian.co.uk/world/blog/2011/mar/15/japan-earthquake-and-tsunami-japan
(link changes each day, check on front page for the current day's link)
BBC is slightly slower but more accurate (but they beat the Guardian when announcing the 4th explosion).
thanks, this is useful
But there almost certainly must be spent fuel rods in all the basins, since fuel changes are done at least as often as 18 months and spent fuel takes two to four years to cool enough to be safely moved offsite. The fuel still contains enough U-235 to produce considerable heat from just decay, but internal pollutants reduce its ability to contribute in a reactive core. Presumably, spent fuel is not considered to be able/likely to generate a critical event (neutron flux is too compromised by pollutants) so it would not require such sturdy containment as would a reactor.
but the problem is that if they dry up, they heat up to the point of ignition and then you have a highly contaminant fire on your hands (to the point they can't even get close enough to stick one hose into the pool).
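The "considerable heat from just decay" point is easy to put numbers on with the textbook Way-Wigner approximation. This is a rough illustration only: the operating power and fuel history below are assumptions for the sake of the calculation, not data about the Fukushima units.

```python
# Rough decay-heat estimate using the Way-Wigner correlation
# (a standard textbook approximation, good to tens of percent).
def decay_heat_fraction(t_s, t_op_s):
    """Decay power as a fraction of operating power, t_s seconds
    after shutdown following t_op_s seconds of operation."""
    return 0.0622 * (t_s ** -0.2 - (t_s + t_op_s) ** -0.2)

P0 = 1500e6              # assumed thermal power while operating, watts
t_op = 18 * 30 * 86400   # assumed ~18 months of operation, seconds

for label, t in [("1 day", 86400),
                 ("1 year", 365 * 86400),
                 ("4 years", 4 * 365 * 86400)]:
    mw = P0 * decay_heat_fraction(t, t_op) / 1e6
    print(f"{label} after shutdown: ~{mw:.2f} MW")
```

Even years after shutdown the fuel still produces heat on the order of hundreds of kilowatts, which is why a dried-out pool keeps heating toward ignition.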

mac jones
Mar 12, 03:58 AM
Hey, I've been hanging out on the forum for the iPad. But frankly I'm a little confused right now about what I just saw. From appearances (I mean appearances), the nuke plant in Japan BLEW UP, and they are lying about it if they say it's a minor issue. I don't want to believe this. You can see it with your own eyes, but I'm not sure exactly what I'm seeing. Certainly it isn't a small explosion.
Until I know what's really happening I'm officially, totally, freaked out......Any takers? :D
Lennholm
May 2, 10:30 AM
Is your info from, like, 1993? Because this little-known version of Windows dubbed "New Technology", or NT for short, brought along something called NTFS (the New Technology File System) that has... *drumroll* ACLs and strict permissions with inheritance...
Unless you're running as administrator on a Windows NT based system, you're as protected as a "Unix/Linux" user. Of course, you can also run as root all the time under Unix, negating this "security".
So again I ask, what about Unix security protects you from these attacks in a way that Windows can't?
And I say this as a Unix systems administrator/fanboy. The multi-user paradigm that is "Unix security" came to Windows more than 18 years ago. It came to consumer versions of Windows about 9 years ago if you don't count Windows 2000 as a consumer version.
Wait, knowledge is ignorance? 1984 much?
The fact is, understanding the proper terminology and different payloads and impacts of the different types of malware prevents unnecessary panic and promotes a proper security strategy.
I'd say the people who lump all malware into the same category, making a trojan that relies on social engineering sound as bad as a self-replicating worm that spreads using a remote execution/privilege escalation bug, are the ones quite ignorant of general computer security.
Great post! I think the biggest reason security has been so problematic on Windows, aside from the fact that it's the biggest target, is that the default user type is administrator.
The kind of issue in this case, caused by user ignorance, is really the only threat that exists for Windows since XP SP2. Internet Explorer has had sufficient, but very annoying, security measures against this since version 7, and I'm surprised Safari can let these kinds of things slide through so easily.
Security in Windows has been pretty solid for years now, but that hasn't stopped many Linux/Unix/OSX fanboys from claiming Windows security is like Swiss cheese. They don't even bother to do some research, they just keep shouting the same old mantra.
Chwisch87
Jun 7, 09:42 PM
What is the number one thing people actually use? It's the phone.
AT&T here in ATL has gotten noticeably worse over the past month. It was already bad. This puts a serious damper on my staying with AT&T, and pushes me toward switching over to Verizon with Android.
bigwig
Oct 27, 06:01 PM
At the rate SGI is going, I could probably buy SGI myself for whatever is in my pocket within the next year. Talk about a company that failed to follow the industry and adapt with the times.
Probably true, and quite sad really. SGI was a heck of a company in its day. I'm not sure they could have adapted. Once everybody else abandoned MIPS SGI couldn't afford new processor revisions by themselves, and the false promise that was (and is) Itanium irrevocably doomed them. Itanium basically killed off all the competition when the Unix vendors all hopped on the Itanium bandwagon, and Intel's complete failure to deliver on Itanium's promises looks in hindsight to have been Intel's plan all along. Just think of the performance a MIPS cpu would have were it given the development dollars x86 gets.
No point in anyone buying them, the only thing keeping them afloat is the few tidbits of technology they've licensed over the years, which is all just about obsolete now anyway.
SGI's technology isn't so much obsolete (who else sells systems with the capacity of an Altix 4700?) as it is unnecessary. 4 CPU Intel machines do just fine for 99.9% of people these days, and the kind of problems SGI machines are good at solving are a tiny niche. That's not just number crunching, a big SGI machine has I/O capacity that smokes a PC cluster.
Soculese
Sep 21, 10:58 AM
If it contains a HDD (a fact I am not entirely convinced of), I doubt it would be used for recording TV shows.
Programming such a device with a basic remote like the ones Steve Jobs previewed would be near-to-impossible.
If Apple did introduce the ability to record TV shows (which I also doubt), I believe it would be at the computer, only to be streamed to the iTV later.
OK, the TiVo has a remote, but I NEVER use it to pick programs to record. I use the tivo.com website to do this. I would think that since the iTV will connect via wireless to your computer, you could do the same with it.
ChazUK
Feb 23, 02:32 PM
Android is going to do what Windows did. Those who like that Windows experience (read "cheap") are going to go in that direction. Those that want the elegant, minimalistic, rock solid OS, continue to stay with iPhone.
Define "cheap". The only people that save money are the manufacturers, who have lower licensing fees with Android as it is open source. I know for one that the £420 (after 17.5% UK tax) I paid for my Nexus One was anything but "cheap".
One thing I did notice though, in any numbers comparisons. Apple sells one phone, with one OS, and currently with one carrier (a hated one, btw). Android is running on several phones, and many carriers. The actual comparison is flawed. Let me suggest this. If one gets a choice of 'Droid or iP, the iP will win out, even if the iP is a bit more expensive.
What about the rest of the world? iPhone is sold in multiple carriers outside the U.S.A. There is a whole worldwide market to dominate out there. Remember that the original article is citing "the global smart phone market by 2012".
On the subject of price, there is a good chance that Apple may be able to undercut others because they could be using their own chips, soon.
Would that not make the iPhone "cheap"? Nice to know that any money Apple can save to pass on to the customer is defined as "undercutting", yet HTC, Samsung, Motorola, LG (et al.) are all "cheap" for using Android.

LightSpeed1
May 3, 06:40 PM
Looks like I'll stop using safari.
Sounds Good
Apr 10, 06:28 PM
However many of us who live in both OSes prefer Mac OS X on a Mac where it is appropriate.
The only "advantage" is being able to use OS X for the things it is good at.
I'm not sure what you mean when you say "for the things it is good at." What do you mean? What things?
PhantomPumpkin
Apr 21, 09:07 AM
You apparently didn't read the article you quoted.
That version of Skype (since fixed) did not itself send any private data, which by the way, it has your permission to access.
It had a bug in the file permissions it used for caching contact etc info, which meant that it was possible for someone to write an app to look at it, since Skype didn't encrypt their cache files. There's no evidence anyone did so, though.
Kind of like how iOS apparently has a bug where our location history is available to anyone who writes an app to look at it.
Skype did a good job of quickly fixing the bug, but that is hardly the case for EVERY app out there. It was one example of a potential flaw, of which there have been many on Android devices.
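The bug described above is a whole class of flaw: a cache file written with permissive file modes that any local app can read. A minimal sketch of the fix (the filename and data here are made up for illustration; this is not Skype's actual patch) is to create the cache with an explicit owner-only mode:

```python
# Sketch: create a cache file with owner-only permissions (0o600)
# so other local users/apps cannot read it, the fix for the class
# of bug where a world-readable cache leaks contact data.
import os
import stat
import tempfile

def write_private(path, data):
    # O_CREAT with mode 0o600: owner read/write only (subject to umask,
    # which can only remove bits, never grant group/other access)
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
    try:
        os.write(fd, data)
    finally:
        os.close(fd)

with tempfile.TemporaryDirectory() as d:
    p = os.path.join(d, "contacts.cache")
    write_private(p, b"private contact data")
    mode = stat.S_IMODE(os.stat(p).st_mode)
    print(oct(mode))  # owner-only bits; group/other have no access
```

Encrypting the cache contents on top of this would protect even against a root-level reader, but restrictive permissions alone would have closed the hole described in the article.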
bigwig
Oct 27, 06:08 PM
Multimedia, I was wondering if you could address the FSB issue being discussed by a few people here, namely how more and more cores using the same FSB per chip can push only so much data through that 1333 MHZ pipe, thereby making the FSB act as a bottleneck. Any thoughts?
I don't know if Intel ever changed it, but one of the historical reasons you couldn't make a scalable multi-cpu x86 system is that x86s did bus snooping. Once you got more than ~3-4 x86s on the same bus the bus would be saturated by snooping traffic and there would be little room for real data. I think that's why Intel is pushing multi-core so much, it's a hack to work around Intel's broken bus. The RISC cpus (MIPS et al) didn't do that, that's why all the high cpu count systems used them.
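The "saturated by snooping traffic past ~3-4 CPUs" claim can be illustrated with a toy back-of-the-envelope model. All the numbers below are illustrative assumptions, not measurements: the point is only that useful traffic grows linearly with CPU count while snoop overhead grows roughly with the square, against a fixed bus capacity.

```python
# Toy model: on a snooping bus, every CPU's memory transaction must be
# observed by every other CPU, so total bus occupancy grows faster
# than linearly with CPU count while bandwidth stays fixed.
BUS_BANDWIDTH = 1.0    # normalized bus capacity
TRAFFIC_PER_CPU = 0.2  # useful transactions per CPU (assumed)
SNOOP_OVERHEAD = 0.08  # extra occupancy per snooping peer per transaction (assumed)

def bus_utilization(n_cpus):
    useful = n_cpus * TRAFFIC_PER_CPU
    snoop = useful * (n_cpus - 1) * SNOOP_OVERHEAD  # each peer snoops each transaction
    return useful + snoop

for n in (1, 2, 4, 8):
    u = bus_utilization(n)
    status = "saturated" if u > BUS_BANDWIDTH else "ok"
    print(f"{n} CPUs: bus load {u:.2f} ({status})")
```

With these made-up constants the bus is nearly full at 4 CPUs and badly oversubscribed at 8, which matches the post's point: directory-based coherence (as in the big RISC systems) avoids broadcasting every transaction to every CPU.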
appleguy123
Apr 22, 07:50 PM
The makeup of this forum's members intrigues me slightly. Why are most of the posters here Atheists? Is it part of the Mac-using demographic, the Internet's demographic in general, or are Atheists just the most interested in Politics, Religion, and Social Issues?
flopticalcube
Apr 24, 10:43 AM
That's true. I think, though, if anything, the hatred of another religion was a pretty strong motivational force in the US armed forces since 9/11. Especially right after, when many people joined up to fight the Muslims who attacked the USA.
I would attribute that to a personal religious motivation as opposed to an institutional one. Muslims serve in the US forces as well.
blahblah100
Apr 28, 09:19 AM
Some people around here flip-flop on the issue depending on the latest stats.
Don't be fooled.
Next quarter you'll see very, very different numbers. Over the next 3-5 years you'll see the decline of the entire PC market and a shift over to tablets and pad devices as they become more capable and powerful. The ecosystem is already in place. The content distribution model is already in place. Look what you can already do with an iPad. Mirror games onto HDTVs. Photoshop on the iPad. The list goes on. And note how quickly this all happened.
And with a PC, you can actually make the iPad work. :)
bretm
Sep 20, 11:23 AM
I was going to ask why not a PVR, but realized it myself. While Apple does not prevent you from loading music you have acquired through other means onto your iPod, they don't help you either. They don't help you buy CDs because it's too broad an experience to simplify. Same with the PVR. How a customer acquires content from a provider varies too much for Apple to engineer a simple solution. But they can provide their own simple content delivery solution.
Next, they need to provide a NAS for all your media, whether from the store, ripped from disc, or created yourself. Move the media off the computer.
?? TiVo will provide you a PVR that burns DVDs, has a tuner and hard drive, and wirelessly connects to your Macintosh and plays your photo library and iTunes for $300, plus you have to buy a USB network receiver for like $25.
So it's basically the same thing except for the videos, which of course didn't exist when TiVo adopted the technology, and since they'll play your photos they'll probably adopt the videos too. I think I'll just hold out for my TiVo to do the same thing PLUS be a PVR and DVD burner.
Iscariot
Mar 25, 04:50 PM
And...?
I'm far from the first or only person who has deviated from the original topic. You can either move with the discussion, or virtually everything from page 2 on is off-topic. For those of you playing at home, the goalposts have now been moved from hatred to violence to violence specifically from a catholic source to violence specifically from a "real" catholic.
IIRC, you're also the one that made up a statistic
Despite your disregard for the pretext of civility, my source was Wikipedia, which I did in fact cite in post #27. I'll thank you not to make unfounded accusations.
dgree03
Apr 28, 09:30 AM
Let me try to explain what I mean from a different angle:
The number of PCs being sold could remain constant and still fall behind tablet sales in the future. Why? The market expands. Think about who could use a mainframe back in the day. Very few companies. Then minicomputers came along and suddenly many more companies could get one. The market expanded, and even if mainframe sales remained constant, minicomputer sales surpassed them.
Tablets will appeal to those who never got comfortable with PCs. Or who never bothered getting one at all. I've personally seen toddlers and 80-year-olds gravitate toward the iPad naturally. It just fits them perfectly. There's none of that artificial abstraction of a keyboard or mouse between their fingers and the device, they just interact directly. It appeals to them.
Someone who uses a PC almost exclusively for email and web surfing will find a tablet appealing.
Programmers and professional writers used to keyboards will not find a tablet appealing. Not yet, at least.
So when the market balloons yet again to take in the Tablet Era, PCs will continue to be sold, but the number of users in this new market will be larger than the market that existed in the PC Era. Many PC users will move to tablets, and many folks who never enjoyed (or even used) PCs will grab a tablet. It will be bigger than the PC market by 2020.
And by the way, the price premium referred to earlier in this thread? That's unique to Macs versus PCs because Apple does not compete in the low-end of the market. But in the smart phone and tablet markets, there is NO price premium. One day people will forget that Apple ever made "high-priced" items since it simply won't be true compared with the competition.
As for Apple never making headway, they are merely the most profitable computer company on the planet. Nice lack of headway if you can get it.
Oh, I completely understand what you mean; thanks for the further clarification.
Let's not forget that we are dealing with a more computer-savvy generation. Your examples of 80-year-olds and infants are generally correct, but when those infants get to school, they will be using desktops. I think the barrier that existed with the PC's emergence in the late '80s is still prevalent today, though not with the younger crowd.
I think it will get to the point where people have multiple devices in their homes, just as people have laptops, desktops, and tablets (like myself). They will each have a place, but I just don't think tablets will run desktops and laptops out of people's homes in the next 10-15 years.
sinsin07
Apr 9, 07:43 AM
Apple should be courting game developers, not their execs. These execs usually don't know much about games other than how to milk franchises until they're useless while the gameplay suffers.
AidenShaw
Oct 8, 10:23 AM
Faster at what? I'm too lazy to find the part in the keynote where they showed this. Was it 20% faster at something designed to use all 8 cores?
The task was a multi-threaded matrix multiplication that easily scales to multiple cores.
This is representative of many HPC and rendering apps, but not as realistic for most desktop apps (unless, of course, you're like MultiMedia and run several separate instances of desktop apps simultaneously).
The sections in the video are at 11:50 to 15:00, and 26:30 to 28:00. (The gap is while the engineer is swapping CPUs and rebooting.)
My earlier numbers were a bit off - rewatching the video the Woodie system was 40% faster than the Opteron, at 17% less power. The Clovertowns were low-voltage parts "about 900MHz" slower than the Woodies. The octo (dual quads) was about 60% faster than the Opteron at 17% less power. (I'd like to have seen them put in faster Clovertowns, and show what the octo Clovertown would do when matching the power draw of the Opteron.)
At about 25:00 minutes in, Gelsinger says that the "two woodies in one socket" is the "right way to do quad-core at 65nm", due to manufacturing and yield issues.
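For reference, the demo's benchmark code isn't public, so this is only a hypothetical sketch of the general shape of such a workload: a row-partitioned, multi-threaded matrix multiply, the kind of embarrassingly parallel task that scales easily to eight cores. All names here are illustrative:

```python
# Hypothetical sketch of a row-partitioned multi-threaded matrix multiply.
# Each worker computes a disjoint set of output rows; no locking is needed
# because the workers write to non-overlapping parts of the result.
from concurrent.futures import ThreadPoolExecutor

def matmul_rows(A, B, rows):
    """Compute the given rows of A @ B using plain Python lists."""
    n = len(B[0])          # columns of B
    k = len(B)             # inner dimension
    return [[sum(A[i][p] * B[p][j] for p in range(k)) for j in range(n)]
            for i in rows]

def parallel_matmul(A, B, workers=4):
    """Split the rows of A across a thread pool and reassemble the result."""
    m = len(A)
    chunks = [range(start, m, workers) for start in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = list(pool.map(lambda r: (r, matmul_rows(A, B, r)), chunks))
    C = [None] * m
    for rows, block in parts:
        for i, row in zip(rows, block):
            C[i] = row
    return C
```

Note that in CPython the pure-Python arithmetic holds the GIL, so this version shows only the partitioning structure; a real benchmark like the one in the demo would use compiled kernels (e.g. a threaded BLAS) or separate processes to actually occupy all eight cores.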