NickUK69
Apr 11, 11:37 AM
Also... with many people on 12- and 18-month contracts, mobile carriers will want to keep customers by signing them to new contracts and, in doing so, a new phone. If there is no new iPhone, people with 'beaten up' 18-month-old handsets will want to renew, and with nothing from Apple to renew to, Android could be onto a winner here!
Hi
With all the Android phones coming out and manufacturers having no specific cycle, the iPhone is really out of date already!
iPhone 1 - 2G
iPhone 2 - adds 3G
iPhone 3 - the 3GS
Therefore the above three phones are all 'old' compared with what was released around the same time.
iPhone 4 - will be about 18 months old by the time the iPhone 5 comes out.
People will lose interest in Apple iPhones with so many other new releases coming out on a regular basis.
shawnce
Sep 13, 11:48 AM
Yes, that's true.
It's also true that most of the time, most people aren't even maxing out ONE core never mind eight.
And when they do, their program won't get any faster unless it's multithreaded and able to run on multiple cores at once.
Let's not forget things like Spotlight that can now run more aggressively without much impact on CPU resources. You will get more intelligent software that can prepare for what you want to do, so that when you go to do it, it will be much more responsive. In other words, just because some tasks cannot easily be broken up to leverage multiple cores doesn't mean software can't run such tasks speculatively on idle cores in preparation for you doing them.
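A minimal sketch of that idea, purely illustrative and not how Spotlight actually works: hand speculative work to a pool that uses otherwise-idle cores, so the result is already waiting if the user asks for it. The build_search_index task below is hypothetical.

from concurrent.futures import ProcessPoolExecutor

def build_search_index(folder):
    # Hypothetical stand-in for background indexing work; a real indexer
    # would walk the folder and index file contents on a spare core.
    return "index of " + folder

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:                           # uses idle cores
        future = pool.submit(build_search_index, "~/Documents")   # speculative work
        # ...the foreground stays responsive here; nothing blocks on the index...
        print(future.result())                                    # ready when asked for

If the user never asks for the result, the only cost was time on a core that would otherwise have sat idle.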
babyj
Nov 28, 07:57 PM
This isn't a new story - at least one of the major labels was talking about wanting a payment for every iPod sold prior to the last round of contract deals.
Their reasoning was nothing to do with the blank tape / copied music argument - they said that their music was driving sales of iPods so they deserved a cut of iPod profits from Apple.
How they said it with a straight face I'll never know.
This isn't about getting money to the artists who deserve it; this is all about increasing the profits of the major record labels. They don't give a damn about anything, certainly not their artists; they just care about their own profit.
Though I think their biggest problem is that they have looked into the future, realised that it doesn't include them, and are worried. Who needs record labels with digital distribution? How long before a major artist signs a deal directly with a digital shop or distributor and cuts out the record label?
Leet Apple
Mar 2, 09:53 PM
Well, Catholics believe it's a sin to be gay, and the school is probably afraid of parents saying something about a gay man teaching their kids... Being gay and teaching at a religious school just doesn't work. That sucks for him, though.
Yebubbleman
Apr 6, 01:57 PM
http://www.macrumors.com/images/macrumorsthreadlogo.gif (http://www.macrumors.com/2011/04/06/intel-launching-next-generation-macbook-air-processors/)
http://images.macrumors.com/article/2011/02/11/094654-mba.jpg
As reported by Fudzilla (http://www.fudzilla.com/processors/item/22323-new-17w-core-i7-king-brand-is-2657m) and HardMac (http://www.hardmac.com/news/2011/04/06/intel-to-launch-sandy-bridge-chips-that-could-be-found-in-the-new-macbook-air), Intel is about to launch its next generation Sandy Bridge ultra low voltage CPUs suitable for the MacBook Air.
Due to the MacBook Air's thin form factor, it has required the use of particularly low power CPUs from Intel. Apple has stuck with Core 2 Duo processors with a maximum Thermal Design Power (TDP) of 10-17W. Apple is believed to have continued to use this older processor design in order to keep NVIDIA's graphics chips powering their ultracompact notebook. Due to licensing disputes (http://www.macrumors.com/2011/01/10/nvidia-and-intel-settle-nvidia-still-prohibited-from-building-chipsets-for-newest-intel-processors/), NVIDIA was prohibited from building newer chipsets that supported Intel's newest processors.
With the release of Sandy Bridge, Intel upgraded the performance of their integrated graphics chipset. This was good enough (http://www.macrumors.com/2011/02/24/apple-launches-macbook-pros-with-thunderbolt-quad-core-cpus-amd-gpus/) for Apple to offer in their latest 13" MacBook Pros, so we expect it will be good enough for the upcoming MacBook Airs as well. Apple had been previously rumored (http://www.macrumors.com/2011/02/11/macbook-air-sandy-bridge-update-in-june/) to be introducing the "Sandy Bridge" MacBook Airs this June.
HardMac pinpoints the Core i5 2537M (17W) as the possible chip to be used, at least in the 13" model.
Meanwhile, the current 11" MacBook Air uses an even lower power (10W) processor, but it's not clear how much power saving is offered by removing the need for the NVIDIA graphics chipset, as the Intel solution is integrated within the processor itself.
Article Link: Intel Launching Next Generation MacBook Air Processors (http://www.macrumors.com/2011/04/06/intel-launching-next-generation-macbook-air-processors/)
Woo! Something not MacBook Pro or iOS related!
For a programmer dealing with Terminal, Xcode, NetBeans, Eclipse, etc. (not graphics-intensive software), would this MacBook Air be a better deal than the 13"/15" MacBook Pro?
Anyone?
Both machines would be fine, though the 13"/15" MacBook Pro is a more fully featured machine than the Air, and frankly, at that cost, why pay for an incomplete system?
awesome!!! this is really tempting. Should I throw an SSD in my 2010 4GB 2.66 GHz 13" MBP or sell it and wait for the MBA refresh?
If you have to do either, I'd do the former. But I'm on the "screw the Intel HD 3000" bandwagon, and I also don't think that an Air should replace a Pro unless you have a problem lifting five pounds.
I would love to see a 15" laptop with no optical drive, with the specs and price somewhere between the MBA and MBP.
KEEP DREAMING!
Since you have no clue how the Sandy Bridge Airs will perform, I'll take your statement as FUD.
First off, it's a fair assumption that they won't be more powerful than the current 13" MacBook Pros. Second, it's a fair assumption that the Intel HD 3000 will be in tow, and if the SV HD 3000 is inferior to the SV NVIDIA GeForce 320M, then it's a fair assumption that the ULV versions will probably have a similar result when benchmarked. Then again, who in their right mind is relying on a MacBook Air to play games over say, a MacBook Pro or a P-freakin-C?!
Alas, there are some things that the curated App Store will never be able to supply. Case in point: a PokerStars or Full Tilt client. And if the iPad's Safari can't do Java or Flash or allow me to run the applications of my choosing, then it's not sufficiently open for my needs.
Most Flash and Java based sites have App-equivalents. Ideal, no. A true web experience, no. But there's an app for that.
At least I now have a short finite timeline to work with to buy my 13"/2.13GHz C2D/256GB MBA before they "upgrade" it to a vastly inferior Intel GPU.
It's a MacBook Air, for crying out loud! What were you going to use the GeForce 320M for anyway that you won't be able to do with the Intel HD 3000? (Note: Final Cut Studio type things and gaming, which are the only two things that you'd feel the difference between the two IGPs on anyway, are laughable answers.)
I LOL'd. I owned iPad 1 for a year, and while it's nice, it's a FAR, FAR cry from the productivity capabilities of the current gen MBA.
Like it or not, iPad is SEVERELY CRIPPLED for content creation (i.e. real work), but excels at content CONSUMPTION. That's factual and completely undebatable. Everyone knows this.
So, no, it's not "something better". It's a more viable choice for entertainment and consumption. That's it.
The MacBook Air is crippled for content creation purposes. It is no MacBook Pro. The iPad is not crippled for content consumption. Sure, the iPad isn't yet the most stellar option for content creation purposes, but it's not crippled for what it's intended to do. With a 13" or 15" MacBook Pro, there's little practical use for a MacBook Air unless you have a problem lifting the two extra pounds, and really, if you do, either exercise or invest in physical therapy.
What is the obsession with back-lit keys?
Do you actually look at the keyboard when you're typing?
It's more that a feature was taken away and the natural psychological response when that happens is "Why did you do that? Give it back!"
The current NVIDIA chip is also integrated, so it's not that much of a step down. As a 13" Pro user, I can happily tell anyone that for what the product is made for, it's perfectly usable. At first I was pissed at the idea, but it turned out the Intel HD 3000 was more powerful than the graphics in my old laptop.
It's a step down from the GeForce 320M, but a step up from the GeForce 9400M and the Intel GMA IGPs used before it.
By game I mean a modern title at full settings. Otherwise it's just 'making do'.
+1
sorry but if you're trying to do "pro" work on a MBA, ur doin it wrong.
i'm glad Apple has their MBA line for ultra-portability, plus the MBP line for intensive portable work.
This.
deconai
Aug 11, 03:37 PM
Well now, you ignorant Yankee ;) Firstly, mobile phone penetration in Europe is about 99%, or maybe slightly more. You should really travel a bit to get some perspective.
And secondly, GSM has a user base of over 1 billion while CDMA, as you said, has some 60m users. Which one do you think would be the more interesting market for a new mobile phone manufacturer to cover? And there is really no question of "we'll see which one wins", because GSM won a long, long time ago, hands down.
Are you saying 99% of Europeans use cell phones or that 99% of Europe is cell-ready? If the former, then there must be a ton of kids yapping it up on the wireless. ;)
milo
Jul 14, 02:58 PM
Can anyone tell me the purpose of dual drive slots nowadays? I can see the use for them (and had computers with them) when each drive was limited to one function, i.e. a DVD-ROM in one and a CD-RW in the other, but now that everything can happen in one drive, with speed not being an issue, is it really necessary to have two?
Same purpose. DVD-ROM in one, Blu-ray or HD-DVD in the other. Plus two are nice for duping.
Too expensive on the low-end, if true. I suspect we'll see a lot of reviews and benchmarks giving a bad cost to value ratio for the Macs.
You obviously haven't shopped around. Price out machines with these CPUs at Dell; you're looking at $2400/$2600/$3700. I think these prices are too *low* based on chip prices and current PC prices. I think that whole grid is bogus.
As for the 3G chip, it could be a BTO option. I assume other video cards would be BTO options as well.
gnasher729
Aug 17, 09:17 AM
Edit: Please ignore this post, I can't count!!!
If you buy a Xeon 5160 (3.0GHz) at the moment they are £570. Apple are charging £530 to upgrade from the Xeon 5150 (2.66GHz) to the Xeon 5160. Bearing in mind that you can probably sell the original 2.66GHz chip for around £300, it would be cheaper to buy the lower spec Mac Pro and upgrade yourself.
It's not something I would do myself, but some enterprising dealer could easily do that. In US prices, difference between two 3.00 GHz and two 2.66 GHz chips is roughly $300, and you could sell the 2.66 GHz ones at a premium because they had "extra burn-in testing" :-)
Seriously, the problem is getting money for the 2.66 chips.
SevenInchScrew
Nov 24, 01:20 PM
...I can't say how this compares to GT4 but so far it's been amazing
You have 800 cars exactly as they were in GT4, so you'll get a good idea. :p
My buddy picked this up today, so I'll be checking it out on Friday when we hang out. I'm not buying it without trying first. It will be interesting to see how well it plays. After waiting 6 years for another full Gran Turismo, I have big expectations. But hey, even if it doesn't play as well as I'm hoping, the photo mode looks excellent. I can spend a LOT of time in there.
inkswamp
Jul 28, 04:34 AM
gnasher729, thanks for taking the time to explain that. I had to read it twice, but I get it.
So it seems that in many ways we're getting the best of the G5 and the best of Intel with the Core 2 Duo chips. As these kinds of things unfold, Apple's decision to switch to Intel chips makes more and more sense. They probably knew where Intel was going. Interesting.
Sine Qua Non
Apr 25, 03:47 PM
As a consumer, why should I be subjected to this risk which doesn't benefit me in the slightest? And why should this data be "backed up," secretly, to my computer?
Your phone stores this so it doesn't have to re-acquire connection locations every time you move a few hundred feet. It's "subjecting" you to better battery life by not having to work as hard to keep you connected. Oh noes.
...And it backs up the data for the same reason it backs up EVERYTHING when you sync -- so you can restore without losing any of the data on the phone.
What, you want crappy battery life, slower speeds, and loss of data if you need to restore your phone?
.
All I can say is that I've encountered none of these horrors since installing untrackerd last week.
NoSmokingBandit
Sep 1, 11:15 AM
Idk, that just doesn't sound right...
They have higher-res models from the GT4/GTPSP artists (everything 3D is made with super-high poly counts and then downgraded as the game's engine requires), so I don't understand why they would use the low-poly models from GT4 when it would take just as much time to export a higher-res model from Maya.
Time will tell, I suppose, but it just doesn't make sense for them to gimp the standard cars for no reason.
jamesryanbell
Apr 6, 10:51 AM
I have something better than a MacBook Air. It's called an iPad 2.
I LOL'd. I owned iPad 1 for a year, and while it's nice, it's a FAR, FAR cry from the productivity capabilities of the current gen MBA.
Like it or not, iPad is SEVERELY CRIPPLED for content creation (i.e. real work), but excels at content CONSUMPTION. That's factual and completely undebatable. Everyone knows this.
So, no, it's not "something better". It's a more viable choice for entertainment and consumption. That's it.
ictiosapiens
Aug 17, 04:39 AM
Could you give some evidence for that, except that they are underclocked on the MacBook Pro _when they are idle_?
And the MacBook... nearly 50% underclocked, as if the 950 were so amazing that it could be crippled to half of its mind-blowing performance...
dustinsc
Mar 22, 12:52 PM
BlackBerry PlayBook = the iPad 2 killer - you heard it here first.
Look at the specs: they're greater than or equal to the iPad 2's, with the exception of battery life.
Well, minus the screen size too. Equal to isn't going to cut it against an Apple product. Just look at how the Zune fared.
skunk
Feb 28, 06:04 PM
A same-sex attracted person is living a "gay lifestyle" when he or she dates people of the same sex, "marries" people of the same sex, has same-sex sex, or does any combination of these things.
No, it's called "living a human lifestyle".
I think that if same-sex attracted people are going to live together, they need to do that as though they were siblings, not as sex partners. In my opinion, they should have purely platonic, nonsexual relationships with one another.
Why should your hang-ups be of any relevance to anybody else? Perhaps you need to deal with your own perceptions instead of relying on some dusty tome to tell you what to think. You know that Plato was a repressed homosexual, don't you? He spent hours at the gymnasium ogling naked young men, and perhaps, like S/Paul, spent a lot of effort telling other people how to love to expiate his guilty feelings.
Heterosexual couples need to reserve sex for opposite-sex monogamous marriage.
You are extraordinarily keen to prescribe what other people should do. What's it got to do with you?
If I had a girlfriend, I might kiss her. But I wouldn't do that to deliberately arouse either of us. If either of us felt tempted to have sex with each other, the kissing would stop right away.
You sound like a real catch, but hey, what you choose to do is up to you.
Sacramentally, same-sex "marriage" isn't marriage. Neither is merely civil marriage of any sort. If I understand the Catholic Church's teachings about merely civil marriage, it teaches that non-sacramental marriage, whether same-sex or opposite-sex, is legal fornication.
So, you assert that a married non-Christian couple can do nothing but fornicate? What an appallingly demeaning attitude! Do you regard any couple you meet as probable fornicators by default? Do you question them about whether they use birth control, or whether they were married, and if so whether it was in a Catholic church with the proper sacraments? You clearly swallow Catholic dogma hook, line and sinker, so choosing righteous friends must be a real PITA.
gnasher729
Aug 18, 03:31 PM
That's showing that the quad-core Mac Pro is essentially the same speed as the dual-core Mac Pro. To translate it to a normal Mac scenario: if Apple releases a 2.66GHz Conroe iMac/Mac/what-have-you, it will be able to crunch through FCP/Photoshop/etc. faster than a Mac Pro, because it can use regular DDR2 and won't suffer from horrendous memory latency.
It only shows that one company can expect to get massive complaints from its customers soon about its crappy software. An H.264 encoder can easily use two dozen cores if they are there (apart from the fact that it might be limited by the speed of the DVD drive if you encode straight from DVD); there is no reason at all why this software shouldn't be twice as fast on a Quad core and four times as fast on an eight core machine.
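A rough sketch of why such an encoder should scale with core count, assuming the video can be split into independent chunks (real H.264 encoders parallelise along broadly similar lines); encode_chunk below is a hypothetical stand-in for the heavy per-chunk work, not any real encoder's API.

from concurrent.futures import ProcessPoolExecutor
import os

def encode_chunk(chunk_id):
    # Hypothetical stand-in for compressing one independent chunk of video.
    return "compressed-%d" % chunk_id

if __name__ == "__main__":
    chunks = range(24)                                   # two dozen chunks of source
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        encoded = list(pool.map(encode_chunk, chunks))   # one chunk per free core
    # Stitching the results back together is cheap, so wall-clock time shrinks
    # roughly with core count until the DVD drive or disk becomes the bottleneck.
    print(len(encoded), "chunks encoded")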
dethmaShine
Apr 19, 02:58 PM
Apple may have expanded upon existing GUI elements, but it didn't invent the GUI. Very big difference there.
What's the issue?
When did I ever say Apple invented the GUI?
LagunaSol
Apr 11, 01:27 PM
given Apple's increasing tendency to underwhelm us with new technology features (which are in fact old by the time of their introduction, 1-2 years after everyone else), I doubt we get any of these three.
Yeah, like all those trailblazing Android tablets that are 1-2 years ahead of the iPad, right? :rolleyes:
Zimmy68
Apr 7, 11:36 PM
If there is one indisputable fact of this world...
Those on message boards who say they hate Best Buy are the first to grab the Sunday ad and visit the store at least weekly.
Bank on it.
Chupa Chupa
Apr 10, 07:41 PM
There is a part of me that hopes Apple screws up and dumbs down FCS. This is the only remaining software that keeps me buying expensive Macs. If they turn FCS into a glorified iApp, then I'm dumping my Macs and moving on to a build-your-own PC where I can run Linux and all of the industry-standard professional apps.
I think that with this new release of Final Cut, Apple is going to shove a dagger into its professional line. In the last keynote, Jobs mentioned the transition to a "post-PC" business model. The only way that Apple can devote itself exclusively to iStuff is to wean professionals away from using their products. Once FCS becomes a new video editing program aimed more at the masses, running on iPads, Apple will be able to say that they don't have a need for the pro line of computers anymore. Say goodbye to anything Mac Pro.
Whatever Apple announces Tuesday is going to be a strong indicator of the future of the professional line. If they announce an amazing FCS 4 for professionals, then we will know they are committed for the long run. However, if they turn Final Cut into some kind of cheesy video editing app for the mass consumer, then you had better start rethinking your professional future with Apple - unless you make your money making crappy YouTube videos.
So much elitism there, it's dripping off my screen. Your post is funny b/c when FCP 1.0 was announced, many of the "pro" editors of the time gasped b/c it, well, "dumbed down" editing, similar to how PageMaker 1.0 dumbed down publishing.
What Apple does best, what it's always done best, is define new paradigms. It sounds like that is what may happen on Tuesday. Clearly, for all your snobbery, you are a horse-and-buggy driver and not a buyer into the Model T thing. Enjoy your Linux, but physical media is still dying nonetheless. Editing for the web needs a new set of editing tools. YouTube has a lot of professionally edited material. It's not all cell phone clips.
LanPhantom
Mar 31, 02:41 PM
How is it biting them in the ass? Android is the fastest-growing OS, with a larger share than iOS. I think it's been a very successful strategy.
It's because of the Buy One Get One option. Nothing more. People choose that option because it makes financial sense, and if they don't really care about the OS or the phone, they will choose the one that fits their checkbooks. If Apple were to OK AT&T and Verizon to do a Buy One Get One on the iPhone, there would be no comparison. It would be game over for Android.
-LanPhantom
fraserdrew
Aug 6, 04:12 PM
Vista is also 6 months out, probably more. This is no different than when Apple released 10.0. There WAS a reason 10.1 was free to 10.0 users. Microsoft will get this cleaned up over the 18+ months it takes Apple to come out with 10.6. Leopard has to go the distance, and I too have been using Vista in-house since the early alphas for internal app testing. It's come a long way. It still has a long way to go, but the core IS there. MS simply needs to bug-fix the heck out of it, which will happen within 2-4 months of release with SP1, and then SP2 another 6 months after that.
I'm not a long-time Apple user and don't know about the Classic to OS X transition, but I do know that two service packs and bug fixes every month did nothing for XP, hence my move to OS X. So, OK, I assumed that this will be the same case with Vista, but considering that (I think) concept viruses have already been written, and that Microsoft really are up against the clock, I think that for at least the first year Vista will be hellish.
After that, OK, maybe things will change, but it seems to me that this isn't the biggest upgrade ever (I'm an end user and mainly use PCs for web browsing and school work, so I haven't seen any major good things in Vista), and Microsoft have struggled to get it out. (Sorry, kinda off topic.)
512ke
Aug 26, 03:57 PM
If the power consumption is the same... does that mean that the Merom and the current chips suck the same amount of energy while going full throttle?
If the above is true, if you turned down the Merom to match the speed of the current chips, wouldn't the Merom be drawing 20% less power?
In other words if the Merom and the current chip were both going 60 mph down the freeway, would the Merom be drawing less power?
Am I missing something here (such as the basics of electricity, the basic way that chips work, etc.)?
512ke
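A hedged back-of-the-envelope check of the question above, treating the roughly-20%-better performance per watt as an assumption from this thread rather than a measured Merom figure:

# If a chip does ~20% more work at the same full-throttle power, what would it
# draw when clocked down to match the older chip's speed? (The 20% is the
# thread's assumption, not a measured number.)
perf_per_watt_gain = 1.20
power_at_matched_speed = 1.0 / perf_per_watt_gain
print("about %.0f%% less power at equal speed" % ((1 - power_at_matched_speed) * 100))
# Prints about 17% less - and the real saving would likely be bigger, since a
# lower clock usually permits a lower voltage, and dynamic power scales roughly
# with voltage squared times frequency.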