bpaluzzi
Apr 28, 01:06 PM
No, I understand quite well. Your example leads me to believe you don't.
People didn't wear, display, or carry their internet connection in public, but they did the iPod.
Why do you think white headphones, and MP3 players of similar look, shape, and form factor, became popular (from other manufacturers, mind you) after the iPod became popular? Likely because it was a popular look/gadget that many people wanted.
A fad rarely includes items of technology, but sometimes it does. The idea of the iPod being a fad isn't just mine; it has been discussed for a few years now, especially since the introduction of the iPhone.
Cheers
Yeah, you still don't understand what a fad is. Wow.
Swift
Mar 18, 03:44 PM
DRM is a big, fat target for every hacker in the world. I doubt very much it will ever be perfect. It can't be. It would be easy to encrypt music so badly that you couldn't play it. To allow legit users to listen to it means the key is already there. The hacker just finds it.
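The "key is already there" point can be sketched with a toy cipher; everything below (the XOR scheme, the key name) is a made-up illustration, not any real DRM system:

```python
# Toy stand-in for a DRM scheme: XOR with a repeating key.
def xor_crypt(data: bytes, key: bytes) -> bytes:
    """XOR each byte against the repeating key; applying it twice decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The label "protects" the track with a secret key...
SECRET_KEY = b"player-embedded-key"   # hypothetical key baked into the player
track = b"some audio samples"
locked = xor_crypt(track, SECRET_KEY)

# ...but any legitimate player must carry that same key to play the file back,
# so whoever inspects the player binary can extract it and unlock everything.
assert xor_crypt(locked, SECRET_KEY) == track
```

However strong the cipher, the playback device still has to hold the key, and that is exactly what the hacker goes looking for.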
ezekielrage_99
Sep 25, 11:37 PM
Exactly... Now I have to wait even longer to jump into the Mac foray... I'm holding on until these 8-ways come out... I hope it is soon!
But seriously how many cores does anyone REALLY need?
Still the more the merrier I say :cool:
Howdr
Mar 18, 08:35 AM
OMG, you still don't get it:
Let's try explaining it this way...
When you subscribe to cable, you pick a package that provides you with the channels that you want. There are various packages, but ultimately it's all just video streaming over a cable (bits in this day and age, not analog)...
Based on your and others' arguments, why can't we all just pay for basic cable and get all 500+ channels plus the premium channels for free? Very simply, you're paying for a package with specific features....
No no, as long as you abide by the amount of data in the plan it should not matter how you use it.
You can't steal what you paid for; you buy 100 cable channels, and that is what you get and use.
You buy 2GB and use 1GB, you have used 1GB no matter if it's on the phone or laptop. 1GB = 1GB.
With your cellular service, you chose a package that meets your needs. You have 3 options for data plans at this point, well, 4 technically...
1) Your grandfathered unlimited plan
2) 250mb
3) Data Pro 2GB
4) Data Pro 2GB + Tethering 2GB for a total of 4GB....
Ok? The tethering gives you 2GB for the money, I see that, and I have read that Tethering and Data Pro are added to total 4GB for the charge. So you and AT&T prove my point, thank you! Data = Data; they add it together and it is the same.
Tethering is not the same as using the data on your device; essentially, tethering is using your phone as a modem. Your data plan (which I'm assuming is either unlimited or 250MB) does not include the feature of using your phone as a modem; that's what the extra charge is for....
If you want to tether, you need to pay for the appropriate package. Just like if you want HBO, Showtime, or HDTV you need to pay for the appropriate cable package...
LOL, no, it's the same use of data as on the phone.
Tethering does not do something different to AT&T; it's just using data.
You may not understand how data is used from the source, but I assure you there is no difference to AT&T between when you tether and when you surf YouTube on the phone.
To AT&T, Data = Data, and those have been their words, not mine, every time it's printed by them.
So far I have not seen an argument that proves otherwise.:rolleyes:
bugfaceuk
Apr 9, 09:26 AM
Heh, you put "REAL" in caps. :p
If you don't believe me, there's plenty of history to read. Just go look at the following industries that were disrupted by technology...
milo
Jul 13, 11:17 AM
Apple will offer a New Form Factor 64-bit Dual-Core Conroe Mini-Tower whether or not a single chip Woodie is in the lineup. They'll have no choice.
Not necessarily. They could also just put the conroe in the base model with the same form factor, although they probably wouldn't be able to get it as cheap. I don't really care if they go with the mini form factor or not as long as the price is low enough.
The single Xeon configs I was referring to were NetBurst-based ones.
(snip)
Apple tried the PowerMac mini, as it were, and you did not buy it; it was called the G4 Cube.
That's a $300 difference in list price. Even if apple pays half of that, it's a significant amount, not to mention that the difference goes higher the more ram you buy.
Sure, it makes sense for companies to offer a single woodcrest config IN ADDITION to conroe configs. It mostly makes sense for users who want to add the second chip themselves in the future. But all those companies also will sell conroe configs, and they will be cheaper. It just doesn't make sense to sell single woodcrest as a substitute for conroe, apple would likely be the only company doing that.
And the cube failed because it was simply outrageously overpriced (I would NOT consider it "powermac" by any stretch of the imagination, but it still cost almost as much as the full towers). They brought it back as the mini which has sold very well and demonstrated that people DO want smaller, cheaper alternatives.
everettmarshall
Apr 13, 08:38 AM
Not having seen FCPX first hand I will completely withhold judgement on the app until I do.
But I will make the observation that, for some, it seems the price point is what makes this app "less" pro: more people can get it and call themselves video or film editors, when they are no more editors than someone who buys a tool set at Lowe's is a mechanic.
Having the tools doesn't mean you know how to use them - but with more people having the tools thinking they do - the value of those that REALLY do can be affected if it appears that "anyone" can do it.
AJsAWiz
Sep 18, 07:37 AM
Add me to the excessive dropped call list; I keep getting them randomly over the past two weeks at my house. I'm going to call AT&T today, hopefully score a MicroCell.
Well, I've been calling AT&T continuously (have had this problem for about a year now) and have gone the entire gamut of troubleshooting solutions (some I've done twice) but the dropped calls and weak signals prevail. AT&T wants to accept zero responsibility for these issues nor do they seem to be either willing or able to fix the dropped call/weak signal issues.
SO, in a nutshell . . . . good luck with that. Hope you are more successful in your attempts. Then you could come back and share the magic formula :)
puma1552
Mar 12, 06:16 AM
Ugh, just as soon as I had posted...
Beg to differ. You've been praising Japanese nuclear power plant construction as being superior to the impoverished Soviet ones that go into meltdown. Well, we've all now seen your "testament to building codes" argument from "experts on Japanese nuclear regulations" totally explode, and it is now lying in rubble. Unless of course you now insist that the building exploding and collapsing on the core is part of the building codes? ;)
I haven't "been praising" their construction, I "praised" their construction in one post, if you can even call it that. The Japanese know what they are doing by and large in many of the things they do; that's why Japan has had 30% of its power delivered via well-developed, and well-understood nuclear sources for years, while the west is still outright paranoid of so much as a mention of the word nuclear.
The only thing I did was compare it to Chernobyl, or rather defend it against that comparison; it certainly is not Chernobyl, and was built to higher standards than anything in the USSR at that time, Chernobyl included.
You think they built the plant 40 years ago and have done literally nothing in terms of maintenance and/or upgrades since that time? You don't think regulatory statutes and codes have changed during that time, and that they've had to comply with those and be subject to normal regulatory inspections that meet today's 2011 safety and energy protocols?
Just because the plant was built 40 years ago doesn't mean it is the same plant as what was built 40 years ago. Trust me, I was and am fully aware that the plant is older than Chernobyl. But the difference is that Chernobyl ate it during a time of 1980s USSR safety standards, when the international nuclear community wasn't nearly as effective as it is today. Today's plant may be 10 years older than Chernobyl, but it's 30 years further up to date. Nuclear plants in the first world don't exactly get the "build it and forget it" treatment.
I don't want to argue about this, because it's pointless since we are all hoping for the best and fearing the worst. But I do know a thing or two, and it gets tiring correcting false information proliferating throughout thanks to a bunch of people in the media who have no technical training and haven't a clue about anything. The Japan forums are ablaze with misinformation.
Nuclear power is generally pretty safe, and it's a shame the west hasn't been able to embrace it, IMO. That isn't to say tragic accidents can't happen, as they can, but by and large they are extremely, extremely rare.
gorgeousninja
Apr 21, 08:58 AM
What's wrong with that? I may not own a particular product but like being in X products forums to learn about it.
In your case, "learning about a product" seems to revolve around telling everyone how misguided they are.
Maybe you need to look up the definition of learning.
diamond.g
Apr 21, 09:12 AM
Or, you know, turn location off. Hard to look at it when it hasn't been tracking. Skype did a good job of quickly fixing the bug, but that is hardly the case for EVERY app out there. It was one example of a potential flaw, of which there have been many on Android devices.
Have we established that turning off location services actually disables this "feature"?
nixd2001
Oct 12, 09:48 AM
Originally posted by MacCoaster
javajedi's Java and Cocoa/Objective-C code has been available here (http://members.ij.net/javajedi) for a couple of days. My C# port is available for examination if you e-mail me.
I was thinking of the x86 and PPC assembler produced for the core loops. I could bung the C through GCC and get some assembler on my windy tunnels, true, but I'm not geared up to do the Windows side of things.
strale
Mar 20, 07:00 PM
Music is too expensive, and the music industry doesn't do anything to fill the needs of the consumer. An AAC file doesn't cost a penny to produce, unlike a CD, so why is an AAC file so expensive? The music industry doesn't allow anyone to sell MP3s, which is the format most likely to be accepted by the consumer. At least Sony now knows that MP3 is the future; their products now play MP3, unlike half a year ago. MP3 is the most common format: my car radio plays it, my iPod (whose hard drive crashed half a year ago) plays MP3, my laptop plays everything, and even our DVD player plays MP3. Why in god's name should I buy an AAC file? It doesn't play on anything other than the iPod and my PowerBook. Every vendor has its own format. I wouldn't buy a song in Apple's iTunes Music Store. Sure, maybe Apple would sell MP3s if the music industry gave them the rights to do so, maybe not, but who cares? I don't buy AAC, I don't buy WMA. MP3 is the present, and the future!
QCassidy352
Jul 12, 09:45 AM
I'd just like to direct all of your attention to this thread (http://forums.macrumors.com/showthread.php?t=211175&highlight=conroe+merom+imac) and ask those of you who said merom was going to be in the imac: what were you thinking? :confused: ;)
I realize it's a little early to be gloating, but c'mon, it's definitely going to be conroe. Which, btw, I find even more exciting than the mac pro news because while I'll never have a mac pro, an imac is always possible. :cool: (though I'm thrilled about woodcrest in the mac pro anyway because it allows the imac to get conroe, and because it's great news for those of you who want a mac pro. :))
dethmaShine
May 2, 09:13 AM
Actually there's been malware for OS X since it was introduced. There is malware for every operating system.
Nothing can defend against user stupidity.
Well, that's true.
Thanks for correcting me.
Apple OC
Apr 24, 04:53 PM
Many people say this, but they fail to see that such actions are part of culture and not representative of the religion itself.
I invite you to demonstrate how Islam is a threat to freedom and democracy.
I guess all this honour killing pretty much explains the original point about how the freedom of women has been affected.
thanks again edifyingG for presenting some very valid points
UnixMac
Oct 8, 10:41 AM
OS X being 25 years old (actually, UNIX is much older) is a GOOD thing; software (read: an OS) can evolve much more easily than hardware. Unix is a work in progress to this day, and this is why it is years (literally years) ahead of Windows.
As for x86 being great: sure, the top x86 at 2.8GHz is faster than the top G4 at 1.25GHz, but not 2.2 times faster, as the clock would have you think. And when you add AltiVec-coded software like Photoshop, you actually get more IPC than the P4. So the architecture of the G4 is superior; however, the P4 is faster by a small margin due to its significant clock advantage and long pipeline.
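The clock-vs-IPC arithmetic in the post can be made explicit; the IPC figures below are placeholder guesses, not measurements:

```python
# Raw clock ratio: 2.8 GHz P4 vs 1.25 GHz G4.
p4_clock, g4_clock = 2.8, 1.25
clock_ratio = p4_clock / g4_clock            # 2.24x on clock alone

# If the G4 retires more instructions per cycle (say, via AltiVec),
# the effective gap shrinks well below the raw clock ratio.
p4_ipc, g4_ipc = 1.0, 1.8                    # hypothetical IPC values
perf_ratio = (p4_clock * p4_ipc) / (g4_clock * g4_ipc)   # about 1.24x
```

With those assumed IPC numbers, the real-world gap would be roughly 1.24x rather than the 2.24x the clock speeds alone suggest.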
I think a G5 with multicore processing and a bump in clock will eclipse x86 entirely. AMD is the best bet against the G5, and when that day comes, as it will, this argument will be moot.
I for one am still waiting on Apple to make a PowerBook worth my $3500 investment. That, I think, is long overdue.
appleguy123
Apr 22, 10:33 PM
Would it make a difference if a huge portion of what you've been exposed to, regarding religion/Christianity, was fundamentally incorrect? For example, there's no such place as hellfire; nobody is going to burn forever. Everybody isn't going to heaven; people will live right here on the earth. If you learned that a huge portion of those really crazy doctrines were simply wrong, would it cause you to view Christianity/religion differently?
I would first like to know by what standard you could call those doctrines wrong while verifying your own.
LeeTom
Mar 18, 01:55 PM
OPTION 3 - they're sniffing TCP/IP traffic and, depending on the traffic, can identify whether the originating IP has a private addressing scheme. As an ISP, I imagine you have some leeway to sniff traffic to solve problems, but I'm not sure if this would count as legitimate.
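A sketch of the private-range test such a sniffer might apply (the function name is made up, and in practice NAT rewrites the source address, so the private IP would have to leak via payloads or other header quirks to be visible at all):

```python
import ipaddress

def looks_tethered(src_ip: str) -> bool:
    """True if the address is in an RFC 1918 / private range, hinting the
    traffic originated behind a NAT (e.g. a laptop tethered to a phone)."""
    return ipaddress.ip_address(src_ip).is_private

# Typical private addresses a tethered laptop might expose vs. a public one.
assert looks_tethered("192.168.1.10")
assert looks_tethered("10.0.0.5")
assert not looks_tethered("8.8.8.8")
```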
AidenShaw
Sep 21, 11:30 AM
Just the file (which will be cached if the network can't cope).
My point is that it's possible that the "network can't cope", exactly.
legreve
Apr 6, 04:04 AM
One thing that got me was that you cannot make apps fill the screen without dragging and resizing. You can only resize from the bottom right corner. No real other annoyances for me that I can think of.
That is being dealt with in Lion... you'll be able to resize on all edges.
I was in the same situation as you, OP, until some 4-5 years ago, when I was introduced to the Mac through work. I was stubborn and went through the whole "a PC is equal to a Mac and cheaper" rubbish :)
But this also colors me in relation to noticing bad things about OSX.
I agree with the window resizing thing. But since that's taken care of... well.
To be honest I think you need to consider the positive sides as well. Things like not having a visible program folder with all sorts of nice files to click on. It's basically just an icon on a mac (though one that you can explore to reveal the contents).
Another thing... I never fully understood why I had to be bothered with the way a pc starts up. First the loading screen with hardware checks and what not. Then the black screen, then the windows loading screen and if one had it enabled, the login screen, and then the whole preparing of the desktop area to start up services and so on.
Compared to OSX, that is just too much, and not being a programmer etc., I couldn't care less about all the initial info the boot screens on a PC give me.
What you won't like about switching is the extremely closed universe of Apple. You sync items to a specific computer instead of having a free-roaming device that can sync anywhere. Christ, it's easier to copy files from my work iMac to my HTC phone than to my mate's iPhone... ??!!
One thing to add about the Apple universe is that I think they are working their way towards an even tighter App Store. In the future I could easily see something similar to that Sky idea where you don't own the app but a license to access the online content :S I think that will take longer to catch on in the PC universe.
Regarding browsers... I work with FF all day. But at home I was used to Explorer 8. I really like Explorer better than FF. Can't explain why, I just feel FF is heavier now to use than IE is. Also it seems like either FF or OSX requires more addons to use the same websites and services than IE on my pc does.
Folders... I'm so used to the whole disk-drive-with-subfolders setup, e.g. E: and then a folder for every little thing I've got.
The OSX system while probably the same, feels different. Best explained by:
OSX: 2-3 cabinets with several drawers and in the drawers are subfolders.
Windows: 1 cabinet, 1 drawer, lots of subfolders. (unless one partitions ones drive :) )
But all in all, I'm really, really happy that I switched. My new MBP feels stable, OSX looks nice (I even geeked out and changed the looks of my folders etc. :P), and it allows me to concentrate on what is important, and that is not tweaking (I'm not 15-18 anymore), it's getting work done and getting it done smoothly.
In general, it's only about adjusting to a new setting.
jimitrott
Feb 24, 06:07 AM
Android might surpass the iPhone. The iPhone is limited to one device, whereas Android spans many more devices and will continue to branch out.
jwdsail
Sep 20, 11:42 AM
Apple iPod Video Express... (I'm hoping to kill the "Chicken Little" talk that the iTV name will get Apple sued)
A hard drive? Hard to believe, I'd think some flash memory as a buffer, maybe 4GB? Perhaps you can add a HD via the USB 2 port? Too small to have a 3.5" drive.. May be too small for a laptop drive.. A 1.8" drive would add too much to the cost, wouldn't it?
I think w/ the HDMI output, and the price, what we're staring at is really a wireless upscaler... Take any content from your Mac, and wirelessly upscale to the native res of your TV (up to 1080p)...
If this is the case, I may just buy one in place of the Mac mini (w/ something other than Intel Integrated *SPIT* Graphics BTO, that will more than likely never happen...) that I've wanted to add to my TV...
Shrug.
Just my $0.02US
jwd
bigwig
Oct 27, 06:01 PM
At the rate SGI is going, I could probably buy SGI myself for whatever is in my pocket within the next year. Talk about a company that failed to follow the industry and adapt with the times.
Probably true, and quite sad really. SGI was a heck of a company in its day. I'm not sure they could have adapted. Once everybody else abandoned MIPS SGI couldn't afford new processor revisions by themselves, and the false promise that was (and is) Itanium irrevocably doomed them. Itanium basically killed off all the competition when the Unix vendors all hopped on the Itanium bandwagon, and Intel's complete failure to deliver on Itanium's promises looks in hindsight to have been Intel's plan all along. Just think of the performance a MIPS cpu would have were it given the development dollars x86 gets.
No point in anyone buying them, the only thing keeping them afloat is the few tidbits of technology they've licensed over the years, which is all just about obsolete now anyway.
SGI's technology isn't so much obsolete (who else sells systems with the capacity of an Altix 4700?) as it is unnecessary. 4 CPU Intel machines do just fine for 99.9% of people these days, and the kind of problems SGI machines are good at solving are a tiny niche. That's not just number crunching, a big SGI machine has I/O capacity that smokes a PC cluster.