drewyboy
Aug 29, 01:34 AM
Personally, I like how the MBPs are; just give them black aluminum, like the new U2 iPods have on the back. Very sleek and sexy.
corywoolf
Sep 5, 03:32 PM
Showtime is owned by Viacom, which also owns The Movie Channel and a cable company.
Who also owns the trademark "Mighty Mouse", which Apple licensed. I think he might be onto something. More than likely, the product is called "Showtime".
aegisdesign
Sep 10, 04:47 PM
1024 CPUs??? WOW... and I thought I had nasty simulations. :o
Still, don't you think it's a terrible waste of computing power if the app doesn't take advantage of multiple processors, even though it might be very hard to write such an app? This is really not my field and I know far too little to have an opinion, so take it for what it's worth.
You had to explicitly write your applications in a special parallel-computing version of Fortran or OCCAM. It was exceptionally quick at matrices and vector equations, so working out the weather was one of the things it was good at. They did a later DAP with 4096 processors. :-)
The point is, multiple cores are only of use if you've got a task that can be split up into separate threads. Many general-purpose computing tasks simply can't be multithreaded easily, or at all.
On the Mac, though, the main advantage of at least two cores is that the OS can run the WindowServer task (which handles all your windows on screen and consumes a lot of CPU when you've got 16 apps open) on one core and your application on another, so everything stays nippy and you don't get the beachball so often when switching apps. The second core can also be doing something like running backups, indexing a hard drive for Spotlight, hot-clustering files, updating thumbnails in iPhoto... Past two cores you're into diminishing returns, except for specific applications that can be multithreaded.
The one advantage Macs have had for a few years, of course, is a long history of dual-CPU machines. Windows, on the other hand, rarely has multithreaded applications. Both OSes are a pain in the arse to write multithreaded apps for, though. The wisdom of BeOS's designers would work wonders with today's CPUs.
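To make that concrete, here's a minimal POSIX-threads sketch in C of the kind of explicit splitting a task needs before a second core buys you anything. The worker name and the two-thread count are just for illustration; an array sum happens to split cleanly, which is exactly what most general-purpose code doesn't do.

```c
#include <pthread.h>
#include <stdio.h>

#define NUM_THREADS 2           /* e.g. one worker per core */
#define N 1000000

static double data[N];
static double partial[NUM_THREADS];

/* Each worker sums its own slice of the array. */
static void *worker(void *arg) {
    long id = (long)arg;
    long chunk = N / NUM_THREADS;
    long start = id * chunk;
    long end = (id == NUM_THREADS - 1) ? N : start + chunk;
    double sum = 0.0;
    for (long i = start; i < end; i++)
        sum += data[i];
    partial[id] = sum;
    return NULL;
}

int main(void) {
    pthread_t threads[NUM_THREADS];
    for (long i = 0; i < N; i++)
        data[i] = 1.0;
    for (long t = 0; t < NUM_THREADS; t++)
        pthread_create(&threads[t], NULL, worker, (void *)t);
    double total = 0.0;
    for (long t = 0; t < NUM_THREADS; t++) {
        pthread_join(threads[t], NULL);   /* wait for each slice, then merge */
        total += partial[t];
    }
    printf("sum = %.0f\n", total);
    return 0;
}
```

The merge step at the end is the catch: anywhere the slices depend on each other you need locking, and that's where most apps stop being worth threading.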
sushi
Aug 23, 10:28 PM
Apple makes money off of iTunes - they won't tell us how much, but it is a money maker (albeit insignificant compared to the iPod)
I think that you mean Apple makes money off of iTMS (iTunes Music Store). And yes, it would be interesting to know how much they really make.
IJ Reilly
Aug 23, 08:28 PM
As has been mentioned, typical patent litigation runs $5-$10M paid to the attorneys. With the main lawsuit and 5 countersuits, they could have made a big dent in that $100M. Even when you have a large legal staff, litigation is usually handled by outside firms that specialize in those kinds of trials. With 32 million iPods sold in 2005, even a $3 licensing fee (~1% on average is not an atypical licensing fee) would easily surpass $100M if you were planning to sell iPods for more than one more year. A lump sum is preferable.
There are also less obvious or tangible costs. Uncertainty is never good; buyers may shy away from a purchase if they feel there is a potential that the product will soon be abandoned or unavailable. There's also the fact that the discovery process in such lawsuits is often used as a tool to try and pry information out of the other side, such as future product plans, which might well be worth big $ to keep under wraps. And last but not least is the distraction that such a suit places on the key employees who may be involved in designing a workaround or simply being deposed and directly involved with the trial.
B
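To spell out the arithmetic in the quoted figures (just a rough check, nothing more):

$$32{,}000{,}000 \text{ iPods} \times \$3 \approx \$96\mathrm{M} \text{ per year},$$

so anything beyond roughly one more year of sales at that rate pushes a running royalty past the $100M lump sum.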
True, but let's put it this way: Apple didn't settle for $100 million just to avoid legal fees that would have run perhaps 10% of that sum. Remember, Apple was going up against a much smaller company with far less in the way of resources. If Apple could have ground Creative down over years of protracted litigation with some assurance of getting a better deal, I have little doubt they would have done so. I suspect Apple saw a RIM-like situation, where they were unlikely to prevail in court and, in the meantime, the litigation environment would create opportunities for competitors.
HecubusPro
Aug 31, 02:53 PM
I posted this on the mini specs forum, but thought it would be needed here as well.
Are they already shipping then?
http://www.appleinsider.com/article.php?id=2010
Peikko
Apr 30, 08:33 PM
MSFT has not had a real hit in forever.
Can't be bothered to check anything but the most recent past, so...
Kinect Confirmed As Fastest-Selling Consumer Electronics Device (http://community.guinnessworldrecords.com/_Kinect-Confirmed-As-Fastest-Selling-Consumer-Electronics-Device/blog/3376939/7691.html)
Guinness World Records, the global authority on record breaking, today confirm that the Kinect for the Xbox 360 is the Fastest-Selling Consumer Electronics Device. The hardware, that allows controller-free gaming, sold through an average of 133,333 units per day, for a total of 8 million units in its first 60 days on sale from 4 November 2010 to 3 January 2011.
The sales figures outstrip both the iPhone and the iPad for the equivalent periods after launch. [...]
Fast Shadow
Apr 25, 04:00 PM
I really can't say enough good things about my new MBP 17. If next year brings a redesign then it will need to be one hell of an improvement to get me to switch, because this thing has impressed me so much more than I expected.
jsw
Oct 28, 07:35 PM
I'm not going to wade through all of the posts here and delete another few dozen off-topic ones, but it's clear that this thread is incapable of staying on-topic, which is a requirement in the news forums, and so it's being closed.
Silentwave
Jul 15, 04:28 PM
I know that it is a desktop chip, but I would expect a site like AnandTech or Tom's Hardware to check against the Core Duo just to see how much difference there is between the two "Core" CPUs.
Why?
Mobile vs. desktop
32 bit vs. 64 bit
Pentium M architecture vs. Intel Core microarchitecture (yes, Yonah uses the latest version of the Pentium M architecture, far more efficient than NetBurst)
and I doubt very much they have comparable mobos/machines to test them on.
andrew050703
Sep 21, 10:05 AM
Sorry to interrupt the network discussion, but has anyone got anything new to share/discuss on the iPhone (unless I read the thread wrong ;))?
Can it really offer all that functionality (from the patent report) in a candybar-style phone, or will they have to release two: a functional one for iPodding/texting/phoning, and a PDA for office work on the move (and everything else)?
blahblah100
Mar 30, 12:48 PM
Sue M$
What about Apple?
tsugaru
Mar 22, 04:03 PM
Is it necessary these days? Back in 1999 it was still difficult just to get video going at a good rate. These days it isn't hard to get good graphics.
What would be the use of redundant graphics? It must be a very small wedge of the market.
Not everyone wants a GPU for graphics. Some want it for games. Some want it for CUDA/OpenCL. Some want it for future proofing (kind of an oxymoron with a Mac, but that's another story.)
Apple has been pushing the 'good enough' mantra on iOS users for a while now. Specs aren't the greatest, but it's good enough and the software does well. It doesn't always work that way in (real) computers. There are always customers who want/need a bit more.
And not redundant graphics. I had meant Apple should be doing the latest and greatest again regarding their GPUs, like they did way way way back when. I should have just typed it all out. I'm in a slight food induced coma now.
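On the CUDA/OpenCL point, for anyone who hasn't seen GPU compute: the interesting part is a few lines of kernel code, and the rest is host boilerplate. Here's a minimal OpenCL vector-add sketch in C; error checking and resource releases are stripped to keep it short, so treat it as an illustration rather than production code (build with -framework OpenCL on the Mac, -lOpenCL elsewhere).

```c
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

/* Kernel: each work-item adds one pair of elements. */
static const char *src =
    "__kernel void add(__global const float *a,\n"
    "                  __global const float *b,\n"
    "                  __global float *c) {\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "add", NULL);

    /* Copy inputs to the device, reserve space for the output. */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);
    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %.0f (expect 30)\n", c[10]);
    return 0;
}
```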
0815
Apr 20, 01:50 PM
Enough with the Chicken Little episodes already.
Apparently, this is related to AT&T only and it is not based on GPS location services but rather a database of cell towers. It contains no identifiable information and is sent to AT&T for analysis for signal strength statistics.
Since it does not contain personal information and is being used to analyze the state of the AT&T network, I don't see a problem here. People who are not inside of the US are not affected by this.
If you think that this is a privacy concern then you need to have your head examined. It is anonymous statistical information and nothing more.
*edit*
It is possible that this information was being collected for an AT&T app that you could download a while back and the OS is still collecting it in the background regardless of whether you have the app installed. Am I crazy, or is there an AT&T app that consumes this data on the App Store?
Would you mind telling us the source of this knowledge? While I have read that this happens only on GSM devices, I know from my device that it had data from all over the world. But most interesting would be where you found the information that that file gets shared with AT&T.
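For anyone who wants to check their own copy: the file being discussed is the consolidated.db cache, which is just a SQLite database, so it's easy to inspect. Below is a minimal C sketch that dumps the cell rows. The CellLocation table and its column names are the ones reported in coverage at the time, so treat them as assumptions and adjust to whatever your copy actually contains (build with -lsqlite3).

```c
#include <sqlite3.h>
#include <stdio.h>

int main(int argc, char **argv) {
    if (argc < 2) {
        fprintf(stderr, "usage: %s consolidated.db\n", argv[0]);
        return 1;
    }
    sqlite3 *db;
    if (sqlite3_open(argv[1], &db) != SQLITE_OK) {
        fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
        return 1;
    }
    /* Table/column names as reported for the iOS 4 cache; adjust if yours differ. */
    const char *sql = "SELECT Timestamp, Latitude, Longitude FROM CellLocation;";
    sqlite3_stmt *stmt;
    if (sqlite3_prepare_v2(db, sql, -1, &stmt, NULL) != SQLITE_OK) {
        fprintf(stderr, "query failed: %s\n", sqlite3_errmsg(db));
        sqlite3_close(db);
        return 1;
    }
    while (sqlite3_step(stmt) == SQLITE_ROW)
        printf("%.0f  %f, %f\n",
               sqlite3_column_double(stmt, 0),  /* seconds since 2001-01-01 */
               sqlite3_column_double(stmt, 1),
               sqlite3_column_double(stmt, 2));
    sqlite3_finalize(stmt);
    sqlite3_close(db);
    return 0;
}
```

A quick scan of the latitude/longitude pairs is enough to see whether the file really covers everywhere the phone has been.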
JGowan
Aug 23, 08:36 PM
You have to wonder how tenuous Apple's position was, considering that they settled so early (in huge-lawsuit time). 100 million dollars is a lot of money to spend to get Creative off their back.
Well, it wasn't just this lawsuit. Five lawsuits were settled @ $20M a suit, plus no distractions from dragging this out... Plus they're now paid up FOREVER to use this license, and they could recoup some money if licenses are granted to others... doesn't sound as drastic as $100M suddenly going down the toilet. There's some value there for Apple.
munkery
Jan 13, 01:41 PM
There's nothing to set up. You should increase the setting to maximum when you first install Windows 7, but other than that it has nothing to do with playing games online.
You should have a unique identifier (password) attached to the authentication mechanism (UAC in Windows), so Windows users should run as standard users. But using a standard account in Windows causes issues with some software, such as some online games that require admin accounts (or "run as administrator"; superuser) to function. Many online games on Windows 7 still require running as Administrator (superuser privileges) to function. This requires setting the "Properties" to allow "run as Administrator" or turning off UAC. This is risky, as the games connect to remote servers and download content, and trojans can be installed without authentication if accessed with superuser privileges. This example, using online games, shows the problem with how software is being written for Windows; the same problem led to DLL hijacking exploits (http://www.computerworld.com/s/article/9181513/Hacking_toolkit_publishes_DLL_hijacking_exploit). You definitely need good antivirus software in Windows to more safely play games that require Administrator privileges.
The issue with online games found in Windows is not problematic on Mac OS X, given that software for the Mac is written following the principle of least privilege (https://secure.wikimedia.org/wikipedia/en/wiki/Principle_of_least_privilege) more so than Windows software. For example, I have played online FPS games on my Mac with standard account privileges that require "run as Administrator" (superuser privileges) on Windows systems. Mac OS X is much better insulated from malware.
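To make the "run as Administrator" risk concrete, here's a minimal Win32 sketch of the standard check for whether the current process carries Administrators membership. The helper name is made up for the example; the API calls (AllocateAndInitializeSid / CheckTokenMembership) are the usual ones. When this reports elevated, everything the game downloads and every DLL it loads runs elevated too.

```c
#include <windows.h>
#include <stdio.h>

/* Returns nonzero if the calling process has Administrators membership. */
static BOOL running_as_admin(void) {
    BOOL is_admin = FALSE;
    PSID admins = NULL;
    SID_IDENTIFIER_AUTHORITY nt = SECURITY_NT_AUTHORITY;
    if (AllocateAndInitializeSid(&nt, 2,
            SECURITY_BUILTIN_DOMAIN_RID, DOMAIN_ALIAS_RID_ADMINS,
            0, 0, 0, 0, 0, 0, &admins)) {
        CheckTokenMembership(NULL, admins, &is_admin);
        FreeSid(admins);
    }
    return is_admin;
}

int main(void) {
    puts(running_as_admin()
         ? "elevated - anything this process fetches runs elevated too"
         : "standard user - damage is contained to this account");
    return 0;
}
```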
Flash, Adobe, Java, etc. all have virtually identical issues under all three OSes. It's rare you see something that only affects one, unless it's a significantly different program.
Vulnerabilities in those components on Mac OS X are attributed as OS X vulnerabilities because OS X includes them by default, so this artificially inflates OS X's numbers in vulnerability comparisons. Those same components have worse security in Windows, where the vulnerabilities typically manifest through Internet Explorer.
Eidorian
Apr 14, 01:10 PM
The real question that I haven't seen anyone ask is: will this be Intel-only, or will other chipsets/manufacturers support it as well?
It appears to be Intel-only for now, and it is a rather large controller compared to USB 3.0 ones.
Intel gave many other vendors a field day for profits by not supporting USB 3.0 on their PCH, though this did drive board costs up, and certain vendors preferred to wait for Intel to simply include support. To be honest, it only appears to be Apple.
Huntn
Apr 11, 10:47 PM
If you try to imagine the future of society and governance, you are either going to regress to unregulated capitalism and robber barons, or move forward to regulated capitalism or some form of socialism. The idea is to raise up the majority of people, not every person for themselves, not screwing over your fellow human being for your personal advantage.
I'd say since the high point of post-WWII, we as a society in the U.S. have done our best to eradicate the New Deal and move back to reaching for magnificent wealth while screwing each other over.
So what would you call moving forward?
anotherkenny
Apr 30, 04:40 PM
Tom was referring to this feature (http://arstechnica.com/business/news/2011/01/shows-over-how-hollywood-strong-armed-intel.ars).
"Intel... takes advantage of a new hardware module inside Sandy Bridge's GPU to enable the secure delivery of downloadable HD content to PCs, has been blasted as "DRM." But of course it's only a DRM-enabler�a hardware block that can store predistributed keys that the Sandy Bridge GPU uses to decrypt movies a frame at a time before they go out over the HDMI port."
It allows for secure playback of cloud movies, without the risk of pirating. Your own files aren't being scrutinized.
Clix Pix put the matte preference well in an old post (http://forums.macrumors.com/showthread.php?t=245491):
Go "matte.....easier on your eyes under all lighting conditions, more accurate representation of what will be printed or show on other people's monitors."
Photographers, and people who don't like sparkly, reflection-prone screens, go with matte.
"Intel... takes advantage of a new hardware module inside Sandy Bridge's GPU to enable the secure delivery of downloadable HD content to PCs, has been blasted as "DRM." But of course it's only a DRM-enabler�a hardware block that can store predistributed keys that the Sandy Bridge GPU uses to decrypt movies a frame at a time before they go out over the HDMI port."
It allows for secure playback of cloud movies, without the risk of pirating. Your own files aren't being scrutinized.
Clix Pix put the matte preference well in an old post (http://forums.macrumors.com/showthread.php?t=245491):
Go "matte.....easier on your eyes under all lighting conditions, more accurate representation of what will be printed or show on other people's monitors."
Photographers and people who don't like sparkled/ full of reflection monitors go with matte.
gleepskip
Apr 20, 10:03 AM
If this is your biggest worry about people being able to track you... hmph.
Tinfoil hats are going to be all the rage here soon.
I didn't mention what my biggest worry is.
AidenShaw
Sep 9, 10:01 AM
Good - now we won't have to wade through any arguments with fanbois who claim that the iMac is the "most powerful desktop on the planet"....
:D
Eddyisgreat
Feb 26, 12:44 PM
Truth
//thread
infernohellion
Sep 26, 09:40 AM
Well, what about Thailand, where the law says GSM phones must not be exclusive to only one carrier (all must be sold unlocked)?
ksz
Jul 14, 11:35 AM
that was just noise.
Someday you might pick up the signal in that noise. :)
Either way, I'll wait until the iMac gets a desktop chip rather than a laptop one.
Why?