seenew
Aug 27, 02:26 PM
Maybe there will be a new iMac launched with the new iPod in October.
Tom359
Apr 25, 04:14 PM
This is why we need a "loser pays" system.
This would get rid of the "I'm going to sue you so you pay money to go away because it's cheaper than paying the legal bills" tactic. Our system has been corrupted by these nuisance lawsuits.
bedifferent
Apr 10, 10:42 PM
I guess there's a lot of drama in the industry about Apple's refusal to release any kind of road map for FCP, not unlike their other products, and apparently a lot of people are starting to jump ship to Adobe's offerings. Everyone is pretty worried about this new overhaul because the guy who botched iMovie is now in charge of FCP. I'm not into video editing, and I've never used FCP or any product like it, but after hearing about all the drama and excitement surrounding this new overhaul I'm pretty stoked to see what happens.
My friend, who is a documentary film maker, was hired by Apple as a designer working with FCP engineers. In the past, we had differing views on FCP; I believed Apple was dropping it as well as other pro-sumer based products while she thought they wouldn't.
After speaking recently, and without breaking her NDA, she said she's disappointed. The project managers and engineers squabble a lot, and the designers (almost all film-makers and editors) aren't getting much input. According to her, Apple needs to fire the management and instate people focused on bringing the product to a new pro-sumer level. There are a lot of mixed reviews, and (as with Aperture) FCP is an attempt to bridge consumer and prosumer engines, creating a big mess.
We'll see.
The guy who 'botched' iMovie is the same person that created Final Cut and continues to work on Final Cut. Randy Ubillos has been the head of Apple's video editing suites/applications for as long as I can remember.
…and according to those close to FCP development, therein lies the issue...
Chundles
Jul 21, 06:02 AM
I'm going to start saving now for whatever the latest and greatest Mac desktop/monitor combination is in around 2010.
Then I'll post pics of me using it for Word, internet browsing and email. Maybe a bit of iTunes.
Those 16+ cores and lots of RAM will make Safari absolutely fly!!!
dernhelm
Aug 7, 03:53 PM
I am not entirely clear on what all Core Animation does and does not do, but I'm wondering if it and RIUI are not related at some level...
I voted Time Machine. I'm not even sure I'd really use it. But it's a neat idea, and the implementation looks to be nothing less than stunning.
These were my top two as well. I just didn't have quite enough information on how Core Animation is actually set up to vote for it. It's also hard to get real excited about a developer-enabling feature, but it could certainly lead to some cool apps. It's also great that they're eating their own dogfood and using it to code Time Machine.
Time Machine was my vote, mostly because of its wide appeal. This looks awesome, and if it is as effortless as it sounds, it may even be a reason to buy some NAS storage and hook it up at home. The demo I saw was simply amazing.
Great work, Apple. Now get those Core 2 Duo chips in the iMac and I'll be all set. :)
Evangelion
Sep 14, 01:14 AM
Didn't you get the memo, Hyperthreading was a joke.
At worst, it slowed performance down by a few percent. At best, it gave a substantial boost, and multitasking tests clearly benefited from HyperThreading. That said, Intel dropped it, apparently because it consumed too much power. But we might see HT in some future Intel CPUs at some point; you never know.
HT as such is not a bad idea. Sun's UltraSPARC T1 uses such a scheme extensively.
0815
Apr 25, 02:01 PM
What I don't understand is, even if Apple is tracking us, why would Steve Jobs simply lie about the claims? That's what's fishy about all this.
Because Apple is not tracking you. Apple does not get any of that data, they will never see or touch it. It is data that is stored locally on your phone out of reach from everyone except you. "Apple tracks you" would mean that the phone is sending the data 'home', but it doesn't. APPLE HAS NO IDEA WHERE THE F YOU ARE OR WERE (and they probably couldn't care less)
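The claim above, that the cache is just a local file that never leaves the phone, can be sketched in a few lines of Python. The table and column names (CellLocation, Timestamp, Latitude, Longitude) follow public write-ups of iOS's consolidated.db from the time and are assumptions here; for a runnable demo the snippet builds an in-memory copy instead of opening a real backup.

```python
import sqlite3

# Build a stand-in for the on-device cache. In reality this would be a
# file in an iTunes backup; nothing here involves a network connection,
# which is the post's point: the data stays local.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE CellLocation (
    Timestamp REAL, Latitude REAL, Longitude REAL)""")
db.executemany(
    "INSERT INTO CellLocation VALUES (?, ?, ?)",
    [(325036800.0, 45.5017, -73.5673),
     (325040400.0, 45.5088, -73.5878)],
)

# Reading the log is a plain local query; only someone with the
# device (or its backup) can do this.
rows = db.execute(
    "SELECT Latitude, Longitude FROM CellLocation ORDER BY Timestamp"
).fetchall()
for lat, lon in rows:
    print(f"logged near ({lat:.4f}, {lon:.4f})")
```

The coordinates and timestamps are made up for illustration.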
logandzwon
Apr 19, 02:51 PM
The First Commercial GUI
http://img62.imageshack.us/img62/5659/star1vg.gif
Xerox's Star workstation was the first commercial implementation of the graphical user interface. The Star was introduced in 1981 and was the inspiration for the Mac and all the other GUIs that followed.
http://img217.imageshack.us/img217/7892/leopardpreviewdesktop4.jpghttp://img714.imageshack.us/img714/5733/xerox8010star.gif
-The Star was not a commercial product. Xerox didn't sell them. (Well, eventually they did, but not as PCs; they were more like what we'd call a terminal today.)
-The middle image is actually of an Apple Lisa. I think you were just showing it as a comparison, but some people might think you're saying it's a Star. It's not. It's a Lisa.
-Apple compensated Xerox for the ideas borrowed from the Star. SJ and the Mac team were already working on their GUI before any of them ever saw the Star, though. Also, the original Macintosh wasn't a copy of the Star; in fact, a lot of the staples of a modern GUI were innovated by Apple for the Macintosh.
SevenInchScrew
Aug 19, 09:21 AM
I'm 100% sure the GT site says all the cars were remodeled for the ps3, as in not the ps2 cars.
http://us.gran-turismo.com/us/news/d5247.html
recreated
As in not copypasta'd over from gt4.
All that I get from that quote is that they are using older models, which will, obviously, be rendered in the new GT5 engine. So the marketing team can say all they want, but actual screenshots of Standard cars don't show much improvement, if any at all, resolution increase notwithstanding.
Do we know if all cars have fully modelled interiors, or if that's just for the luxury cars?
No, the only cars that have an interior view are the Premium models. From NSB's link above...
Standard cars do not support vehicle interior camera views.
sososowhat
Sep 13, 09:50 AM
One could run a Folding@Home process on each core :D
samcraig
Apr 25, 04:16 PM
You have a RIGHT? Really? And where does that RIGHT come from? The only right you have is the right to choose another product if you don't like something about the one you're using.
Stop whining. The phone doesn't even track you. As others have pointed out, the data is cell tower based, not GPS. The phone only logs the same kind of information your cell company already logs.
Normally I would argue that the customer doesn't have a right to a lot of things. But in this case, if you bought a device and it is tracking you (I'm not saying it is or isn't), the customer does have a right to know.
This (sort of) reminds me of how you are now legally allowed to get a free credit report once a year to check whether it's correct. Companies used to make a fortune charging for something people inherently had the right to know.
amin
Sep 14, 10:53 PM
I have noticed this emphasis as well; not being an expert on this issue myself though, would you care to shed light on how their coverage is an exaggeration and why we shouldn't be worried about it?
I am no expert, and I am not denying that this issue matters. However, I see no cause for concern unless someone provides decent evidence that it matters. It strikes me as odd that they (at AnandTech) put so much emphasis on explaining the theory behind a "problem" without making any competent effort to illustrate an example of it. When you go to configure a Mac Pro, the Apple page says the following about memory: "Mac Pro uses 667MHz DDR2 fully buffered ECC memory, a new industry-standard memory technology that allows for more memory capacity, higher speeds, and better reliability. To take full advantage of the 256-bit wide memory architecture, four or more FB-DIMMs should be installed in Mac Pro." Yet AnandTech chose a 1GB x 2 RAM arrangement to compare the Core 2 Extreme and Xeon processors. Using this setup, which effectively cripples the Mac Pro memory system, they find it to be at worst 10% slower than the Conroe Extreme (in a single benchmark that doesn't reflect real-world usage). Meanwhile, in any comparison that utilizes the four cores, the quad Xeon whoops ass by a large margin.
PeterQVenkman
Apr 5, 08:12 PM
Nobody's using Blu-Ray, in my experience.
There is a whole thread about that, though. Don't read it.
Perhaps a little hasty of me; I simply meant to say that in my experience I've never been required to deliver anything on Blu-Ray, and that to my mind it was a purely consumer format.
I've been to quite a few film festivals that take entries on Blu-Ray.
Apple has two mountains to climb: 1) to keep up with their competition where they used to lead, and 2) to convince users that the Mac as a pro platform is a good investment.
NightFox
Apr 19, 01:37 PM
why? iphones outselling itouches by so much makes sense to me.
Just really basing it on my own experience: I'm the only one of my close friends/family to own an iPhone, but I can count 5 iPod Touches in that same group. I also thought there would be a lot of iPod Touches owned by children rather than iPhones.
KnightWRX
Apr 6, 07:26 PM
Totally depends on what tools you are using. Sure, when I'm at home working on a light webapp running nothing but Emacs, Chrome, Postgres, and using, for example, Python as my server-side language, 4GB of RAM is more than enough, hell I could get by with 2GB no problem
You'd need 2 GB for that? My Linux server with about 384 MB of RAM runs that web/db environment without breaking a sweat, with a load average of about 0.1, and that's not even a quarter of what runs off of it.
No, seriously, people overestimate their computing resource needs these days. Xcode is pretty light; Eclipse ran on computers from 10 years ago, and so did NetBeans. Tomcat has been around and hasn't changed much since its 5.0 release back in the early 2000s.
The MBA is fine for running the tools you describe and would make a fine software development station for the needs you expose, don't ever doubt that.
By "run everything", you can't possibly mean run games at "higher than medium" settings, nor edit lots of HD footage in something like Final Cut Pro. Though that's not what YOU use YOUR MacBook Air for
I'd argue the needs I described are shared by many more people than the needs you claim aren't filled by an MBA. I doubt Final Cut Pro movie editing is anything but a small niche of what computer buyers do with their machines, and "higher than medium" settings is not something I use to describe gaming. I value games for their playability, not how they look on my screen. Of course, I come from the era of EGA graphics and AdLib sound cards, when games were about gameplay.
Still, the MBA does fine with iMovie and I can play Civilization IV at full screen on my external monitor of 2048x1156 pixels without breaking a sweat. It is a very capable machine, contrary to what you believe. Use one and see for yourself before you diss the thing. I can understand why you wouldn't be interested in one, I can't however understand the venom you spit at the thing.
please, please, P...L...E...A...S...E - Can we have an integrated Cellular data chip
Get a USB adapter. That way, your $2000 laptop won't be tied to a single carrier the way Apple does 3G in its devices. I'm fine with my iPhone and tethering; I'd rather Apple sell the MBA on the cheap and leave the 3G option up to the users.
It's not like you can't use an MBA over 3G networks this very day (or any other Mac, for that matter).
Wait, so MacBook Air has a TN panel? That makes no sense, the iPad 2 has an IPS panel...
Anyway, I'd like to see backlit keys and an IPS display before I buy a MBA :cool:
Very, very few laptops have IPS displays. The only one that comes to mind is the HP Elitebook with the DreamColor screen option (the standard screen on it is a TN panel).
Apple does not install Flash Player on newer machines, so this is not a problem.
Try youtube.com/html5 (http://www.youtube.com/html5) or ClickToFlash (http://rentzsch.github.com/clicktoflash/) or other HTML5-Safari extensions (http://www.macupdate.com/find/mac/html5%20extension)!
YouTube is not the only source of content out there, and until all video provider sites are HTML5, computers without VDA framework support will be slower, run hotter, and have worse battery life than those with VDA support.
And HTML5 won't be on all video sites until you can graft DRM on top of it. Think of the paid-for streaming providers like Hulu.
BTW, my MBA runs Flash without any problems. I don't need Apple to pre-install it for me.
You obviously know nothing about OpenCL (http://en.wikipedia.org/wiki/OpenCL). OpenCL is not hardware dependent. OpenCL programs can run even on old 300 MHz PowerPC processors, if someone writes a OpenCL-compiler for this platform.
And you obviously don't understand what a GPGPU API is for. What good is running code through an API whose purpose is to offload your CPU by using... your CPU? It makes no sense to emulate OpenCL in software, other than providing OpenCL on computers without a hardware implementation.
In the end, you haven't achieved the purpose of OpenCL, which is to offload the CPU, since you haven't offloaded the CPU at all.
The point is, the Intel HD 3000 on Mac OS X cannot run OpenCL code, so it's up to the CPU to do it.
You failed to even counter my points. Your attempt is only about dismissal, which proves my points are very valid.
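The disagreement above boils down to device selection: OpenCL only pays off when a non-CPU device exists to offload to. Here's a minimal sketch of that logic, assuming the pyopencl bindings; the fallback string and function name are purely illustrative.

```python
def pick_opencl_device():
    """Prefer a GPU compute device; fall back to the CPU otherwise.

    On a machine whose GPU has no OpenCL implementation (the Intel
    HD 3000 case discussed above), the fallback branch is taken and
    nothing is actually offloaded.
    """
    try:
        import pyopencl as cl  # optional dependency; may be absent
    except ImportError:
        return "cpu-fallback"  # no OpenCL runtime at all
    try:
        for platform in cl.get_platforms():
            gpus = platform.get_devices(device_type=cl.device_type.GPU)
            if gpus:
                return gpus[0].name  # a real offload target exists
    except Exception:
        pass  # broken/empty OpenCL installation
    return "cpu-fallback"  # OpenCL present, but CPU-only

print(pick_opencl_device())
```

When this prints "cpu-fallback", running OpenCL kernels buys you nothing over plain threaded code, which is the point being argued.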
arkitect
Mar 1, 05:13 AM
...
...
...
...
...
...
Fascinating as this insight into a mediaeval mind is, please do remember to use the multi-quote.
http://images.macrumors.com/vb/images/buttons/multiquote_off.gif
playaj82
Aug 7, 03:37 PM
If the rumor sites were right....
Mac Pro
Leopard
iPhone
Core 2 Duo
iMac
Tablet, etc...
the keynote would have been 6 hours.
I'm glad they took their time with Leopard and highlighted some neat new and much-needed additions over Tiger.
gauriemma
Aug 26, 08:12 AM
No, because different versions of the ranges were initially posted; only recently has it been clarified. Get with the program and stop trying to be a smartass.
Get with what program? I went to the support site on the day the recall was announced, checked to see if my serial number was in the range, it wasn't, and I went on with my life. Just to be safe, I even checked back a couple days later, and the ranges were still the same as the first time I checked.
I had to do the same thing when I was checking out our Dell laptops at the office. It's really not that difficult a concept. I think some people just like to have something to complain about.
iliketyla
Mar 31, 08:46 PM
This is where the Android "community" is going to split.
The ones we've heard from today don't give a crap about "open" or "closed" or Google or anything else other than the fact that Android is not Apple and is stealing some sales from Apple. They'll defend whatever Google does, because all they want is a platform that's not by Apple to take over the mobile space.
The true believers in the "open" propaganda, as ridiculous as it is and as untrue as it's always been, are probably still in a state of shock. By tomorrow they'll split into two warring camps. One will defend everything Google does because they perceive—wrongly of course—that Android is still in some indefinable way more open than iOS, and they'll blow that little invisible kernel of "openness" up until that's all they can see.
The other camp will be viciously angry at Google's betrayal of the True Religion™ and will be flailing around for some other messiah to deliver them from the "Walled Garden" of Apple and now, Android. These are the people who were saying the other day that "Motorola could rot" with their own OS.
Any suggestions on who the zealots will turn to in their hour of despair? I honestly can't think of a candidate, but then I'm not nuts—at least not that way.
Yeah! That's what'll happen!
Or they'll do further research and realize that the implications in this SINGLE ARTICLE might not be 100% true.
To the everyday user this means NOTHING, as they have no knowledge of what open truly means and therefore can't take advantage of it.
To the users who actually have the know-how to utilize open-source operating systems, this might mean a minor hindrance, but not a complete game changer.
And for clarification, the former is the vast majority.
Did no one notice the obvious bias in this article? It's slanted, and the author clearly thinks that Google has been wrong this entire time.
phatpat88
Jul 15, 12:43 AM
So excited... How come no FW800 in front? That's a little crazy, no?
Right now the only devices I use FW800 for are hard drives... I'd rather have a 2nd USB 2.0 port in front than the 800.
Rooskibar03
Apr 11, 01:06 PM
Guess this isn't as bad as I would like it to be. AT&T moved my upgrade date to 12/3 after I lowered my pricing plan.
Bummer.
notjustjay
Apr 27, 10:33 AM
Really? So you're telling me that the location saved, of the cell tower 100 miles away, is actually really MY location?
Wow!
I think it's not as bad as what the media would have you believe, BUT it is worse than what Apple wants you to think.
Sure, cell towers could be up to 100 miles away. And when I ran the mapping tool, plotted my locations, and zoomed in far enough, I did indeed see a grid of cell towers as opposed to actual locations where I'd been standing. All anyone could know is that I've been "somewhere" in the vicinity.
(And this isn't new. Some time ago I came upon a car crash and called 911 on my cell phone to report it. They were able to get the location to send emergency services just by where I was calling from. It wasn't 100% accurate -- they asked if I was near a major intersection and I told them it was about a block from there.)
However, if it's also tracking wifi hotspots, those can pinpoint you pretty closely. Most people stay within 30-50 feet of their wireless router, and the ones you spend the most time connected to will be the ones at home, at work, and at your friends' houses.
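The accuracy gap described above is easy to put numbers on with a great-circle (haversine) distance. The coordinates below are made up for illustration; the point is the orders-of-magnitude difference between a cell-tower fix and a wifi-router fix.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))  # Earth mean radius ~6371 km

me = (45.5000, -73.5700)
cell_tower = (45.5200, -73.6100)   # a tower can serve you from km away
wifi_router = (45.5001, -73.5701)  # a router pins you to tens of metres

print(f"tower error:  ~{haversine_m(*me, *cell_tower):,.0f} m")
print(f"router error: ~{haversine_m(*me, *wifi_router):,.0f} m")
```

With these sample points the tower "fix" is kilometres off while the router fix is within a few dozen metres, which is why the wifi data is the more sensitive part of the log.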
mobilehavoc
Apr 6, 03:32 PM
Congrats, you will be able to play with the handful of apps designed for it.
;)
You're absolutely right. Better than the junk in the App Store. At the end of the day there aren't that many QUALITY apps on the iPad either. I know because I have one.
KingYaba
Aug 27, 10:45 AM
Maybe an x1800. We all just have to wait and see. :)