iJohnHenry
Apr 23, 07:44 PM
But this doesn't answer the question at all.
Apple users question. Atheists/Agnostics question.
You see a trend yet?
jealousguy86
Apr 20, 10:13 PM
Buying an iPhone 4 tomorrow after reading the news today and just saying "screw it."
It seems that all the discussion lately from these "sources" says the iPhone 5 will basically come out in September/fall, have an 8MP camera, and have the A5 chip. And after what Cook said today, it's obvious there won't be 4G with this new phone, which I didn't really care about in the first place.
So it seems like what most are saying is that this next iPhone will be a modest update from the incredible iPhone 4, which I think is awesome already. So I figure: get it now, then when the iPhone 6 comes out next year, maybe around June/July 2012, I'll just upgrade to that.
Instead of waiting another 5 or 6 months and then getting the iPhone 5, I could just have the iPhone already for those months, and then it wouldn't be bad waiting for the 6th iteration. Oh well, anything's better than the phone I have now.
mangrove
Sep 2, 08:21 PM
:D:D:D
The happiest day of
Great! :) Hope you come back and let us know how the service is and how it compares to AT&T. Which phone did you get?
Since I have an iPad, that is really all I need, plus Verizon. Everywhere I would go where people had no reception (me too, with my iPhone), I would ask what carrier they used; nearly 100% said AT&T. Then in those same instances/places I would ask the people who could talk freely on their phones what carrier they used, and something like 98 out of 100 said Verizon.
That's why I switched. Got a simple phone, the Samsung Haven: 2 phones for $60/month, but only 450 minutes (which I never exceeded even with the 2 iPhones at around $165/month).
Sure hope the iPad is Verizon compatible soon too.
The upside to having 2 dead iPhones: now we have 2 Wi-Fi iPods, so all the iPhone apps work on them. :D
eric_n_dfw
Mar 20, 08:01 PM
I wasn't talking about DRM or iTunes. Okay, but your comment was in reply to matticus' one about the opinion that "breaking the law is breaking the law", and he was, in turn, talking about iTMS and related issues. Sorry if I lost track somewhere, but I assumed you were talking about the same thing.
(L)
May 3, 09:06 PM
No one is forcing you to read or post in any of these threads. You appear to be much more emotionally invested in this than many, including myself. Or maybe your caps lock and question mark keys are stuck.
People sure get emotionally invested about the dumbest things....
Anyone who deliberately uses more than one question mark in English is not properly literate, so let's hope our friend the von Magnum's keyboard is to blame.
G58
Oct 18, 07:56 AM
If I thought it was relevant to mention the people, I would have.
Steve Wozniak co-founded Apple. His inventions and machines are credited with contributing significantly to the personal computer revolution of the 1970s. Indeed, he created the Apple I and Apple II. The latter gained so much popularity that it eventually became one of the best-selling personal computers of the 1970s and early 1980s.
But, and here's the important point, he has nothing to do with the daily running of Apple now and has contributed virtually nothing since the early days. Yet Apple, in its second phase with Steve Jobs in charge, is redefining mobile phones - totally without Woz playing any part in the lineage that made it possible.
Andy Rubin has also founded a company. But his history is that of a man who has come up with some possibly badly timed and poorly executed ideas, and partnered with the same haphazard wisdom. He also possesses more of an employee mentality than that of a visionary to whom money is attracted.
It has to be remembered that Ubuntu [that other example of open-source OS 'success'] is the only 'flavour' of operating system based on the Debian Linux distribution to have broken out of the geek domain into the wider market. And this is a result of Mark Shuttleworth's patronage. Therefore, Google is to Android as Shuttleworth is to Ubuntu: a patron. This isn't how business works. This isn't how businesses make money.
When I speak of lineage, I do so with some degree of authority and experience. The old 'Deep Throat' quote, "Follow the money", embodies wisdom that seems to have escaped you, yet it's true of everything from enterprise to terrorism.
What we have with the iPhone is a genuinely useful, definable lineage that can be accurately tracked in retrospect, as well as predicted to a certain extent in terms of future performance. But don't worry, you're not alone in not recognising that. Sir Alan Sugar made the same mistake of underestimating the iPod, as did Steve Ballmer with the iPhone, and the whole of Wall Street with Apple.
However, we are now in the middle of Apple's iPhone play. [Not literally, but figuratively.] And this play is very, very well planned, conceived and directed. So much so, in fact, that I can see elements of Chinese military strategy at the heart of it. [But that's a discussion for another day.]
In contrast, the Android project is like a flotilla of hopeful yet dubiously piloted little boats, setting out on what they all seem to believe is the same journey, but which, with the best will in the world, can't possibly be. Not only are there too many interests that need to be served, there are far too many opportunities for the 'fleet' to lose contact with each other and their market, make no money, and eventually break up.
You say "It's very likely to happen" re numbers of Android developers and apps etc. Sure, while the water looks good, phone makers have little to lose in pushing handsets to run Android, and several will, inevitably, immediately diluting any potential gain for individual manufacturers. But as soon as interest wanes, users will find lines being dropped, players will drop out of the game, and support will disappear.
So, even though Android may well be, or is possibly EVENTUALLY capable of being, as good a mobile operating system as Apple's iPhone OS is NOW [albeit one developed by an un-monetised network], without the benefit of what Apple brings to the party, in terms of a single identifiable and desirable hardware solution, it's not a credible alternative. It certainly isn't ever going to be a game changer.
And don't forget, we've all been buying phones from these other players for years, and found them all wanting in a vast variety of ways, no matter how varied the choice of form factors and functionality.
Finally, psychologically this choice actually proves to be an enormous negative, as is always the case. More is not less. Fewer choices actually make choosing easier. So why are people betting on the opposite of what experience tells us is true?
Your knowledge of mobile history is a bit lacking.
Good ideas come from people, not companies. Both devices have long personal histories, even though the current iPhone and Android projects only started in mid-2005.
Android was begun by Andy Rubin, who worked at Apple in 1989, then was a major player in Magic Cap (http://en.wikipedia.org/wiki/Magic_Cap), WebTV, and Danger. So there's long experience behind both the iPhone and Android teams.
It's very likely to happen.
As for quoting raw numbers, they're not always useful. There have been over three-quarters of a million downloads of the Android SDK; that doesn't mean that many people are working on it actively. Similarly, many of those so-called "iPhone developers" are regular users who bought memberships to get beta access.
Don't get me started on the "85,000" apps. Tens of thousands are poor duplicates. That goes for all platforms:
Sometimes I wonder how many really unique apps there can be, not just variations. Someone should do a study on the topic; it would be interesting. Must be in the low thousands, if even that many.
jasonph
Apr 6, 03:55 AM
The biggest thing I miss is the ALT + <somekey> to open a menu keyboard shortcut.
What I don't miss: Windows (including 7) is slower on the same hardware than OS X. It also thrashes the hard drive with its virtual memory use in comparison to OS X, and some of its file handling is laughable. Even XP was better than Win 7. I run all sorts of PCs, but you really need a lot of memory, a quad-core CPU and a very fast drive for Win 7 to give of its best. Not so with Mac OS X; almost any of the Intel Macs are fine for most jobs (with the exception of Final Cut Pro, maybe!).
Also, stability-wise, OS X is much more stable than Windows, and apps rarely crash (with the exception of MS Office when it was first released!).
As with all things Microsoft, they take an idea and turn it into bloatware! Almost every MS app I have used feels bloated, even Office on the Mac :(
wnurse
Mar 19, 10:41 PM
That when you do things like this, it hurts Apple. Apple has a market to protect. If people keep doing this, eventually the RIAA will get pissed and won't let Apple sell music any more. It's just like complaining that Apple has had to change their DRM policies. It's not Apple that is doing it; it's pressure from the recording industry. Apple has to walk an extremely fine line, and they do a good job of it, so those folks need to lighten up.
I know this comes as a shock to you, but not a lot of people care whether Apple is hurt or not. While Apple fans are loyal to Apple, PC fans are loyal to no one, and a lot of people who would use this app are PC fans. Also, not everyone who uses a Mac cares about Apple. After all, what do they care if Apple survives? They still get the same paycheck. It's not like if Apple gets richer, we get richer. It's the same with every company. Customer loyalty is fleeting.
citizenzen
Mar 14, 07:34 PM
The equation has to be considered in its entirety.
Did they attack your reading comprehension skills too?
The meanies. :(
NebulaClash
Apr 29, 07:54 AM
A reasonable question, AppleScruff. Indeed, my sample group includes staff, faculty, and students from different disciplines (including business/commerce, and engineering) at a university who use their Macs for research, graduate work, or lecture preparation; a prominent cardiologist at a large hospital; a financial advisor; professional musicians; and many others.
I am myself using a Mac in a business school seamlessly among my PC-using peers. There is nothing that they can do that I cannot - and many things I can do that they would have a difficult time doing in Windows. In fact, my colleagues have been so impressed that one has already made the switch recently, and another is preparing to switch as well. Those days of "needing to run Windows" for work are behind us.
That's been my observation in the business world as well. With projects often being Web-based now, Windows is becoming irrelevant. On one project with about twenty developers, systems architects and analysts, close to half were running MacBook Pros (no Windows installed) and doing very well. It's just not an issue for many office folks. Obviously there are some roles that still require Windows, but not as many as there used to be. The tech folks in particular seem to take great delight in moving to Macs. Times have changed.
roland.g
Sep 12, 06:33 PM
That's what I thought when I saw that they weren't specific about WiFi, simply calling it "802.11 wireless networking" instead of specifically stating "802.11a/b/g".
But that brings up the point of what's sending to it. It doesn't matter that it has new tech to receive at higher bandwidth if the computer streaming to it only sends at 802.11g.
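A back-of-the-envelope way to see the point: a wireless stream is capped by the slowest link in the chain, so a faster receiver buys nothing on its own. A minimal sketch (the rates below are nominal 802.11 signaling rates; real-world throughput is typically well under half of these):

```python
# Nominal 802.11 signaling rates in Mbps (not real-world throughput).
NOMINAL_MBPS = {"802.11b": 11, "802.11g": 54, "802.11n": 300}

def bottleneck(sender: str, receiver: str) -> int:
    """The stream is capped by the slower end of the link."""
    return min(NOMINAL_MBPS[sender], NOMINAL_MBPS[receiver])

# A draft-n receiver gains nothing if the Mac only sends at 802.11g:
print(bottleneck("802.11g", "802.11n"))  # 54
```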
mulo
Apr 22, 08:00 PM
Why are most of the posters here Atheists?
proof?
Multimedia
Nov 1, 10:17 AM
Clovertowns to run hot until 2007, according to:
http://www.reghardware.co.uk/2006/11/01/intel_fwives_core/
Oops! This makes me change my mind about buying this fall:
"HP, and other OEMs, should have Clovertown gear ready on the 14th. Our sources inside HP say the chip is eating between 140 watts and 150 watts..." :eek:
"Intel hopes to deliver less power hungry parts in short order. CEO Paul Otellini has talked about 50W and 80W Clovertown parts set for the early part of 2007 (http://www.reghardware.co.uk/2006/09/26/intel_quad-core_roadmap/)." :)
Guess I'm gonna have to be a little more patient a little longer in that case. That will be after Macworld Expo toward the end of January then. Oh well, so much for immediate gratification. ;) Looks like waiting for the 8-core to ship with Leopard will jibe with the cooler, less power-hungry monsters as well.
Thanks for bursting my bubble. :( I can get back to the business of another longer-term wait, similar to the wait for Santa Rosa, or for the mobile C2D MBP that's shipping now after 10 months of mobile Core Duos. At least it won't be that much longer. :cool: Looks like Clovertown Rev. B will be worth waiting for as well.
My apologies to all who were negatively affected by my extreme enthusiasm for the first Clovertown release before I understood this new information. I can wait. I know some of you can't.
And I also may change my mind again when/if Apple releases a hot version first. Maybe they'll pass on the 150-watt models. Or perhaps they have really good cooling figured out. But I think I'd rather be ecological and buy what consumes less power anyway, especially in light of only another 2-3 months' time.
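For what the power difference could mean in practice, here's a rough sketch of annual electricity cost for a 150 W part versus an 80 W part. The usage pattern (8 hours/day) and the $0.10/kWh price are my assumptions, not figures from the linked articles:

```python
# Assumed usage: 8 hours/day, 365 days/year, at an assumed $0.10 per kWh.
HOURS_PER_YEAR = 8 * 365
PRICE_PER_KWH = 0.10

def annual_cost(watts: float) -> float:
    """Electricity cost per year for a part drawing `watts` while in use."""
    return watts / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH

# 150 W Clovertown vs. a promised 80 W part:
saving = annual_cost(150) - annual_cost(80)
print(round(saving, 2))  # 20.44
```

Not a fortune per machine, but it adds up across a fleet, and the thermal headroom matters more than the bill.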
sth
Apr 13, 04:20 AM
Some pro-style questions that have been left unanswered
Some of those questions actually were answered (for example, that full keyboard control has been retained), and others are more or less no-brainers (like the stabilization question: you can enable/disable and even fine-tune that in the dumbed-down iMovie, so why shouldn't you be able to do it in Final Cut?).
Bill McEnaney
Mar 26, 10:53 PM
It was not a Latin sentence, so it was certainly meaningless in Latin. If you look up "sign", as a noun meaning signification, and instead choose the first person singular of the Latin verb meaning "sign a letter", you are not off to a very promising start. Cicero would be rolling in his grave.
I know the difference between a sign and what it signifies. But even if a group of words doesn't form a sentence, that group differs from the proposition the writer is trying to state with it. That's why you can translate a sentence from one language to another. If I'm only beginning to learn French, I may say something that is ungrammatical, even literally meaningless. But my teacher, or another expert in the French language, may know what it is I'm trying to say with it. Skunk seems to be talking mostly about the signifier, the group of words, when I'm talking mostly about what Caocao intended to signify with it. When someone says something I don't understand, I'll ask the speaker or the writer to rephrase it. I just realized that I misspelled CaoCao's screen name. But I'm sure you guys knew whom the misspelled name stood for.
dmelgar
Jul 8, 12:01 AM
I'm still on the iPhone 3G. I was seriously considering ditching AT&T and the iPhone 4 for a Sprint EVO 4G or Verizon Droid X because a few weeks ago I was dropping every single call in my house (no joke, every single call) and multiple calls per day around town, BUT...
I haven't dropped a call for a couple of weeks now and have had great reception in my house recently as well. Really odd, but encouraging as I decide what to do about replacing this phone.
I've had poor AT&T coverage as well. I could never reliably make a call at home. After suffering for 2 years, I ditched it and went with the HTC Incredible on Verizon. Couldn't be happier. 3G coverage EVERYWHERE. I had forgotten what it was like to have coverage. It's like night and day. Over the last month, I've grown to really like Android as well.

superleccy
Sep 20, 07:18 AM
Is it possible that the cable ports on the back can be used for both input AND output? I don't see why not.
Well, the shape of the USB port suggests that it is for attaching another device to the iTV, not for attaching the iTV to your Mac.
If the iTV doubles up as an AirPort Express, then maybe the USB port is for attaching a printer.
SL
RichP
Oct 26, 12:58 PM
Just thought I'd put in my piece of advice about DVI-DL KVM switches. I'm only aware of three of them on the market; the two most common are from Gefen (www.gefen.com). I'm using the 4x1 Gefen and it works perfectly, switching my primary display between my G5 Quad, two PCs and my MBP. I know the quad switch is double the price, but DO NOT BUY THE 2x1 DVI-DL SWITCH from Gefen!!!
Darn it! That is just stupid. I have a Gefen DVI switch now; it's sad to hear that the 2x1 is junk. The 4x1 isn't worth it to me either: $900 for a switch, when for $1280 (a few hundred more) I get ANOTHER 30"!
I wish the Apple 23" displays just had the quality of the 20" and 30". :mad:
Mac'nCheese
Apr 24, 10:04 AM
I figured I'd use this wonderful Easter Sunday (a day spent celebrating the beginning of spring and absolutely nothing else) to pose a question that I have: what's the deal with religious people?
After many a spirited thread about religion, I still can't wrap my head around what keeps people in the faith nowadays. I'm not talking about those people in third-world nations who have lived their entire lives under religion and know of nothing else. I'm talking about your Americans (North and South), your Europeans, the people who have access to any information they want to get (and some they don't), who should know better by now. And yet, in thread after thread, these people still swear that their way is the only way. No matter what logic you use, they can twist the words from their holy books and change the meaning of things to, in their minds, completely back up their point of view.
Is it stubbornness, the inability to admit that you were wrong about something so important for so long? Is it fear? If I admit this is BS, do I go to hell? Simple ignorance? Please remember, I'm not talking about just believing in a higher power; I mean those who believe in religion: Jews, Christians, etc.
bpaluzzi
Apr 28, 09:00 AM
Best thing I could find
http://www.pewinternet.org/Reports/2010/Gadgets/Report/Desktop-and-Laptop-Computers.aspx
Kudos for looking for something (seriously) -- I'd argue that it's a bit limited in scope, though:
-Limited to America
-Limited to adults
-Calculating by household, with strictly boolean "yes or no" (not counting multiples)
For example, in my house, we have 4 laptops and 1 desktop machine, but for this survey, it would only be counted as "yes" for both. Actually, it wouldn't be counted at all, since we're in England ;-)
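The undercounting is easy to see in miniature. A sketch with made-up households (only the first mirrors the 4-laptop, 1-desktop house above): the boolean per-household coding yields 2 "laptop households" while discarding the fact that there are 5 laptop devices.

```python
# Made-up sample; only the first household mirrors the poster's.
households = [
    {"laptops": 4, "desktops": 1},
    {"laptops": 1, "desktops": 0},
    {"laptops": 0, "desktops": 1},
]

# Survey-style boolean coding: does the household own at least one laptop?
laptop_households = sum(1 for h in households if h["laptops"] > 0)

# The device count that boolean coding throws away:
laptop_devices = sum(h["laptops"] for h in households)

print(laptop_households, laptop_devices)  # 2 5
```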
sth
Apr 13, 04:20 AM
Some pro-style questions that have been left unanswered
Some of those questions actually were answered (for example that full keyboard control has been retained) and others are more or less no-brainers (like the stabilization question - you can enable/disable and even fine-tune that even in the dumbed-down iMovie, so why shouldn't you be able to do that in Final Cut).
matticus008
Mar 20, 09:01 PM
As I understand it, the issue of using music in your wedding video has nothing to do with breaking DRM, but instead with violating copyright. Even if you get the music off of a CD, it would still be illegal.
That was a poor example, I admit. The wedding video situation is fairly complicated, depending on whether you're selling the video (which doesn't seem to be the case) and on the manner in which the song is used. If the song is played in the background by a DJ and it winds up in your video, there's not really an issue. Putting it in in the editing process would fall under fair use for private viewing, but because it's something you're sending out, I can't say off the top of my head whether this is also fair use. You are protected under the law for making mix tapes and CDs, even if you give them away in small numbers. If you make a wedding video and send out two or three copies, I believe this is still considered private viewing. If you send out the video to more than a handful of wedding guests, then you are redistributing and have to obtain permission.
slinger1968
Nov 3, 09:45 PM
I wrote that whole scenario to refute your opinion that software is behind hardware and to show that the opposite is true.
Well, try reading what you are responding to before you get your panties in a bunch. I was clearly talking about most software for the masses, not all software. Most software is currently behind the hardware because most software is not written for more than 2 cores yet.
They aren't. That's my whole point.
Well, you are wrong: most software is behind the current hardware. The hardware is only still weak for a small niche market of power users. You are a power user, but the majority of people out there, especially iMac buyers, are not using their computers for the same tasks. Read any of the computer hardware sites and the reviews of the quad-core processors. They all say that these are currently enthusiast- or power-level parts not aimed at the general consumer.
They aren't because they can't, because the hardware is too weak. That was the entire point of my above post. That's why all these 8, 16 and then 32 core processors are so needed ASAP.
The hardware is only weak for a small niche group of power users. It's ridiculous to think that the average user is doing 3D modeling or high-powered video processing. It's just silly.
I have a dedicated bittorrent/music-playing computer for live, uncopyrighted music. I've downloaded/uploaded over 1 terabyte of data and have specific computing needs for this. I'm just smart enough to recognize that my usage isn't normal.
Again, Read any of the computer hardware sites and the reviews on the quad core processors. They all say that these are currently enthusiast or power level parts not aimed at the general consumer.
KnightWRX
May 2, 05:51 PM
Until Vista and Win 7, it was effectively impossible to run a Windows NT system as anything but Administrator. To the point that other than locked-down corporate sites where an IT Professional was required to install the Corporate Approved version of any software you need to do your job, I never knew anyone running XP (or 2k, or for that matter NT 3.x) who in a day-to-day fashion used a Standard user account.
Of course, I don't know of any Linux distribution that doesn't require root to install system wide software either. Kind of negates your point there...
In contrast, an "Administrator" account on OS X was in reality a limited user account, just with some system-level privileges like being able to install apps that other people could run. A "Standard" user account was far more usable on OS X than the equivalent on Windows, because "Standard" users could install software into their user sandbox, etc. Still, most people I know run OS X as Administrator.
You could do the same as far back as Windows NT 3.1 in 1993. The fact that most software vendors wrote their applications for the non-secure DOS based versions of Windows is moot, that is not a problem of the OS's security model, it is a problem of the Application. This is not "Unix security" being better, it's "Software vendors for Windows" being dumber.
It's no different than if, instead of writing my preferences to $HOME/.myapp/, I wrote software that required writing everything to /usr/share/myapp/username/. That would require root in any decent Unix installation, or it would require me to set permissions on that folder to 775 and make all users of myapp part of the owning group. Or I could just go the lazy route, make the binary 4755 and set mount opts to suid on the filesystem where the binary resides... (ugh...).
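The group-based workaround described above can be sketched in a few shell commands. This is a minimal sketch, not anyone's actual setup: the `myapp` path and `myappusers` group are illustrative, and the demo uses /tmp so it runs without root; the privileged commands are left as comments.

```shell
#!/bin/sh
# Give a shared data directory to a group instead of requiring root.
# Here /tmp stands in for a real system location like /usr/share/myapp.
mkdir -p /tmp/myapp-demo/shared
chmod 775 /tmp/myapp-demo/shared            # rwxrwxr-x: owner and group may write

# With root, you would then hand the directory to the app's group:
# chgrp myappusers /tmp/myapp-demo/shared   # group must exist (groupadd myappusers)
# usermod -aG myappusers alice              # add each user of myapp

# The lazy route mentioned above, for contrast (do NOT do this):
# chmod 4755 /usr/local/bin/myapp           # setuid root: runs as root for anyone

# Confirm the mode took effect (GNU stat, with a BSD/macOS fallback):
stat -c %a /tmp/myapp-demo/shared 2>/dev/null || stat -f %Lp /tmp/myapp-demo/shared
```

Either way, the point stands: the fix lives in filesystem permissions, not in running everything as root.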
This is no different on Windows NT based architectures. If you were so inclined, with tools like Filemon and Regmon, you could granularly set permissions so that this misbehaving software would work for regular users.
I know I did many times in a past life (back when I was sort of forced to do Windows systems administration... ugh... Windows NT 4.0 Terminal Server edition... what a wreck...).
Let's face it, Windows NT and Unix systems have very similar security models (in fact, Windows NT has superior ACL support out of the box, akin to Novell's close-to-perfect ACLs; Unix is far more limited with its read/write/execute permission scheme, even with POSIX ACLs in place). It's the hoops that software vendors outside the control of Microsoft made you go through that forced lazy users to run as Administrator all the time and gave Microsoft such headaches.
As far back as I remember (when I did some Windows systems programming), Microsoft was already advising to use the user's home folder/the user's registry hive for preferences and to never write to system locations.
The real difference, though, is that an NT Administrator was really equivalent to the Unix root account. An OS X Administrator was a Unix non-root user with 'admin' group access. You could not start up the UI as the 'root' user (and the 'root' account was disabled by default).
Actually, the Administrator account (much less a standard user in the Administrators group) is not a root level account at all.
Notice how a root account on Unix can do everything, just by virtue of its 0 uid. It can write/delete/read files from filesystems it does not even have permissions on. It can kill any system process, no matter the owner.
Administrator on Windows NT is far more limited. Don't ever break your ACLs or don't try to kill processes owned by "System". SysInternals provided tools that let you do it, but Microsoft did not.
All that having been said, UAC has really leveled the playing field for Windows Vista and 7 (more so in 7, after the usability tweaks Microsoft put in to stop people from disabling it). I see no functional security difference between the OS X authorization scheme and the Windows UAC scheme.
UAC is simply a GUI front-end to the runas command. Heck, shift-right-click already had the "Run As" option. It's a glorified sudo. It uses RDP (since Vista, user sessions are really local RDP sessions) to prevent the prompt from being faked: the prompt shows up on the "console" session while the user's display resides in an RDP session.
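The runas/sudo parallel can be laid out side by side. A hedged sketch: the account names are illustrative, and the elevated commands stay as comments because they prompt for credentials; only the last, unprivileged line actually runs.

```shell
#!/bin/sh
# Windows, pre-UAC command line (what UAC wraps in an isolated GUI prompt):
#   runas /user:Administrator "cmd.exe"
# Unix equivalent:
#   sudo -u root sh
# Both authenticate, then start the child process under the target account.
# The only thing either mechanism changes for the child is its user identity,
# which any process can inspect for itself:
id -un    # prints the effective user name of this shell
```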
There, you did it, you made me go on a defensive rant for Microsoft. I hate you now.
My response: why bother worrying about this when the attacker can do the same thing via shellcode generated in the background by exploiting a running process, so that the user is unaware that code is being executed on the system?
Because this required no particular exploit or vulnerability. A simple Javascript auto-download and Safari auto-opening an archive and running code.
Why bother? You're not "getting it". The only reason the user is aware of MACDefender is that it runs a GUI-based installer. If the executable had had no GUI code and just run stuff in the background, you would never have known until you couldn't find your files or some guy was buying goods with your CC info, fished right out of your "Bank stuff.xls" file.
That's the thing: infecting a computer at the system level is fine if you want to build a DoS botnet or something (and even then, you don't really need privilege escalation for that; just set login items for the current user and run off a non-privileged port; root privileges are not required for ICMP access, only for raw sockets).
These days, malware authors and users are much more interested in your data than your system. That's where the money is. Identity theft, phishing, they mean big bucks.