jefhatfield
Oct 12, 12:39 PM
Originally posted by snoopy
True for many of us. For applications that use a lot of math functions, it makes a big difference. So, for others it does matter. They may be in the minority, but a very important group of users. In less than a year the picture will change, and that small group will be very pleased with the Mac. For now, there is nothing anyone can do about it.
those math functions are extremely complex and hard to do fast if we stay way behind the curve of the pc world
i was in this computer repair class where we had to do some of the math that a processor does by hand, so we could appreciate that little thing
in the old days of computing, way back when in the 1970s, many computing functions had to be done by phd mathematicians and there were very few silicon "math co-processors"
early computer science college programs were thus a lot like math programs...it's so funny, actually sad, to see how many older, math literate techies were completely unable to relate when gui came along...it was like the great slaughter in silicon valley...we take the mouse and gui for granted but not only did it take away jobs, it also was a curve ball many inflexible older techies could not adjust to
change is never easy in the IT field and that is why it is rare to see anybody go from mathematician with vacuum tubes to green screen coder to gui to "whatever" the future holds
i also had a friend who had memorized hundreds of key combinations like ctrl-a and such and he only just learned to use the mouse two years ago...he took literally five years to learn how to use it with its two buttons...he could never remember, "was that right click, left click, double click, and where do i keep my fingers?"
i could go on with old man stories from the trenches of san jose, but i will stop NOW ;)
if you started with a mouse, it only takes a few weeks to learn how to interact with windows and modern computers
one family friend, a computer professor at stanford, never got used to gui and he still uses his trusty 286...he says he can't think when there is more than one color on the screen and he never got used to the mouse
kind of the way i feel when i use "hex-pee" or try to play a game console thingy like x-box with all those buttons...as a ten year old yanks the keypad/console from me at the computer store and memorizes the keys and buttons within minutes as they relate to the game being played
:p
Demoman
Jul 12, 09:27 PM
They are, you will not see any performance differences between Merom, Conroe and Woodcrest at equal clock speeds, unless you go SMP. They will all encode, render, and transcode at the same pace. The FSB means nothing, as it has yet to be saturated even at 667MHz. Tons of tests and benchmarks at Xtremesystems done over the past few months have proven this.
Making the MacPro line all Dual will be a Big Mistake and will backfire on Apple and force many people to go right back to PC. I can Promise you, if you want a Woody in a MacPro, be prepared to pay an entry fee of $2499 to join this exclusive club of idiots.
I remember when my iMac G4 was starting to show its age, and when the time came to replace it, the minimum price for a real desktop Mac was (and still is) $1999 for a dual 2.0GHz G5. So what did I do? I said goodbye to Apple and built a better machine for half the money. To this day I have no regrets and would never go back, unless I was in the market for a notebook; then I'd get a MacBook.
I still can't believe Apple has the balls to charge $2000 for an outdated desktop that gets outperformed by an $800 PC, while still having a smaller hard drive, less RAM, fewer USB ports, and no card reader. Jobs believes you Mac loyalists are stupid.
Believe me Bro i've already been there.:D
Does not sound like you have been anywhere. Whether the entire line of PMs needs to be SMP is a question for someone close to the sales data. I find your assumption that everyone wants to use a computer the way you do very arrogant and simple-minded. Why are you even on this website? If you hold Apple in such disdain, why not go find a place where you can bond with other folk who have only achieved the same level of computer knowledge and manners as you?
GGJstudios
Apr 9, 12:52 PM
If we're talking laptops, then depending on the model you buy, some may also have heating issues that other brands will not. If we're talking PC desktops, hopefully you've built your own, but if you didn't you can install more fans, a better heatsink, better thermal paste, etc. without voiding your warranty. Last time I checked, if you open your Mac, it voids your warranty.
The fact that a Mac notebook normally runs at high temperatures is not a flaw, "issue," or problem. They are designed to run at such temps. The fact that those who are new to Mac are unfamiliar with this doesn't make it a flaw; they just need to adjust their thinking. And no, simply opening a Mac doesn't void the warranty. For example, replacing/upgrading RAM and hard drives doesn't void the warranty.
archipellago
May 2, 04:32 PM
Such a load of crap that is.
'we've interviewed hackers after conviction'
:rolleyes:
I work for one of the biggest banks in the world and specialise in bank fraud; we liaise with the major law enforcement groups all over the world.
Cutting a deal with a hacker, if we can get one who's up high enough can save millions....with the right info.
mac users tend to be socially engineered via simpler methods anyway, wonder why that is...? :rolleyes:
mac1984user
Apr 15, 10:17 AM
If the media shouldn't project a positive message about being gay, then they shouldn't project a positive message about being straight. No more kissing on TV, film, etc. Ban all public displays of affection and don't say a word about issues that someone might take 'offence' to. Yeah...that sounds like a great world. Ugh...please.
ccrandall77
Sep 12, 03:24 PM
This was the product I was really waiting for. It's cool, but I'm disappointed it doesn't have a DVD player and it looks like it probably won't work with EyeTV. So, I guess I'm sticking with my MacMini on the TV and hoping for a better version of FrontRow soon.
Of course, if this is like the Airport Express and can be used as a WiFi router (and hopefully it's 802.11n compliant), then I could see moving the iMac out of the bedroom and just hooking this baby up to a small LCD TV. I can always use Handbrake to create a streamable video file.
AppliedVisual
Oct 26, 10:34 AM
Considering that Windows supports up to 64 CPU cores, and that 64 core Windows machines are available - it would be nice if you could show some proof that OSX on a 64 CPU machine scales better than Windows or Linux....
Are you being overly pedantic or do you just want to argue? I said WinXP -- "probably as good or better than WinXP". WinXP only supports two CPUs with a max of 4 cores each right now, as per the EULA. The Windows kernel itself actually handles CPU division and scales dynamically based on addressable CPUs within a system, all the way up to 256 CPUs or cores, with support for up to 4 logical or virtual CPUs each. And just think where those 64-CPU Windows systems are going to be in the near future as they're upgraded with quad-core CPUs from AMD/Intel...
BTW: You have to buy Windows Server Datacenter Edition to get to all those CPUs.
jav6454
Mar 18, 01:45 AM
Option 3; STOP trying to cheat the system, and START using your iDevice the way the manufacturer designed it and the way your carrier supports it. (Is it unfair? YES! Are all of us iPhone users getting hosed, even though there's now two carriers? YES)
And while you're at it, knock off the piracy with the napster/limewire/torrent crap.
(Yeah, I said it! SOMEBODY had to!)
Poor thing... he doesn't realize napster and limewire are history. Also, once the data hits my device, it's mine to do with as I please. Thank you very much.
>laughing_girls.jpg.tiff.
r.j.s
May 2, 11:26 AM
You mean like the OS X pop up that asks for your password for the umpteenth time ? ;)
Users are as conditioned to just enter it on OS X as they are on clicking Allow on Windows.
Huge difference in my experience. The Windows UAC will pop up for seemingly mundane things like opening some files or opening applications for the first time, whereas the OS X popup only happens during install of an app - in OS X, there is an actual logical reason apparent to the user. It is still up to the user to ensure the software they are installing is from a trusted source, but the reason for the password is readily apparent.
skunk
Mar 27, 07:10 PM
Meanwhile, please listen to Nicolosi's first answer in video 3 of the first set of videos, the last part of the three-part interview, where he says that homosexuals have a right to live a gay lifestyle
Homosexuals have a right to live the same lifestyle as anybody else, under the Constitution and under the UN Declaration.
Maybe with better furnishings, though...
Jamieserg
Apr 13, 12:35 PM
I like the new Final Cut interface; the old one was getting tired. Plus, rendering in the background will save me SO much time. A lot of my time is spent hitting cmd+r at the moment. Looks like a brilliant release, but as always I'll save my final judgement for when I get my hands on it.
bartzilla
Apr 20, 08:17 AM
One thing I would say, as someone who didn't "switch" but who uses both quite comfortably, is that you need to appreciate how the system works and try and work with it rather than against it, so rather than saying "This is how I used to do things in Windows, now what can I do on a Mac that's similar to the way I used to do it in Windows" you need to think about what you're trying to achieve and find out what neat ways the mac has of getting that done.
This goes both ways, trying to use Windows as if it was Mac OSX isn't much fun, either.
leekohler
Apr 15, 12:00 PM
No but hold on a second. I don't know what scientific evidence has to say about something like morality. It may certainly be that sexuality is immutable. But if you're referring to my quote from the Catechism (and I lost track)... that doesn't say homosexuals are required to change their sexuality.
Yeah, but it sure says that we won't get far if we act like what we are.
Again, completely irrelevant, since this is not catholic theocracy, and your book has no bearing on my life. If that's how you choose to believe, fine- but leave me out of it, and stop trying to force us to live by your rules through law. And guess what? We'll get along just fine.
edifyingGerbil
Apr 27, 03:04 PM
I'm afraid you are.
The Hebrew god is the same god as in polytheistic days, but once he had conquered all his fellow gods, he was left with unrivalled power. The Hebrew religion became monotheistic, and their new old god acquired sole power, but the root of the deity was no more or less than a shared and ancient mythology.
But these arguments don't refer to God as being derived from El, the arguments can only work if "God" is shorthand for "the entity described in the Judaeo-Christian Biblical texts".
The fact he is described on tablets in Ugarit doesn't matter for the purposes of ontological arguments that try to answer does "God" (the Judaeo-Christian God) exist?
This was my point, waaay back, about why I use the Judaeo-Christian God as opposed to god. Someone took umbrage at my use of Judaeo-Christian.
Phil A.
Aug 29, 02:51 PM
The one thing that struck me in the report is the number of marks given to companies who have committed to a timescale. For example, Apple have committed to removing all BFRs but given no timescale and are marked as "bad". Dell have committed to removing all BFRs by 2009 and are marked "Good". Don't get me wrong, it's good that companies are giving timescales, but they don't really mean jack until they're implemented (the UK committed to the Kyoto protocol and will miss its commitments by miles), and I think it's a bit misleading to give any company full marks simply because they have given a date that may be missed. I would have preferred to see those marked as Partially Good, because clearly a commitment isn't as good as actually delivering on promises.
manic
Jul 12, 09:05 AM
Okay, people are hyped about the 4-core Xeon. But aren't we overlooking something here? Aren't server processors designed to do substantially different work than desktops? What's the point in fitting a >$1000 processor into a machine that runs Photoshop and seeing it slug away? I'm not saying that's the case, but I think it's a relevant point and would like to know if anyone knows the answer. If it's slower at desktop tasks, then we will be seeing Conroes in Mac Pros. If it's faster, then there's a pretty good chance it will fit the highest-end one.
now, unless the other chap who said "anything other than woodcrest would be absolutely insulting" knows wc is insanely faster at desktop tasks, I think he's just building some negative hype. Conroes are supposed to outperform everything we've seen so far by a wide margin. It's by no means insulting
iJohnHenry
Apr 22, 09:04 PM
I would suggest that most Apple users are willing to look "outside the box", and not be bound by pre-conceived "notions".
paul4339
Apr 21, 12:17 AM
It skews the numbers nonetheless. iOS is on four different devices: the Apple TV, iPod touch, iPhone, and the iPod touch jumbo. And Google doesn't make any hardware; they work with companies to have them made, like the Nexus series.
The comScore data tracks the number of users ... so if you use four idevices, it's still counted as one user... at least that's what the article mentions.
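The per-user versus per-device distinction the article draws can be sketched in a few lines of Python. The ownership records below are invented for illustration, not real comScore data:

```python
from collections import Counter

# (user, device, platform) ownership records -- hypothetical data.
records = [
    ("alice", "iPhone",     "iOS"),
    ("alice", "iPod touch", "iOS"),
    ("alice", "Apple TV",   "iOS"),
    ("bob",   "Nexus",      "Android"),
    ("carol", "iPad",       "iOS"),
]

# Counting per device inflates a platform whose users own several devices.
per_device = Counter(platform for _, _, platform in records)

# Counting per user (what the article says comScore does): alice's three
# iOS devices collapse to a single iOS user.
per_user = Counter(
    platform for _, platform in {(u, p) for u, _, p in records}
)

print(per_device)  # iOS: 4, Android: 1
print(per_user)    # iOS: 2, Android: 1
```

So owning four iDevices raises a per-device tally but not a per-user one, which is why the multi-device point doesn't skew this particular metric.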
Žalgiris
May 2, 09:30 AM
How stupid does a user need to be in order to install, run, and then enter credit card information into an application that pops up by itself?
:eek:
Indeed. He (the user in general) can be running a NASA mainframe, but if he's a dumbass, nothing will help.
Mister Snitch
Apr 9, 11:46 AM
I am firmly against poaching executives. They should always be deep-fried.
kdarling
Jun 1, 12:36 AM
Ok, just to reference your statement about data using separate channels and whatnot, I guess you are not privy to the technology used in cell towers. Congestion is caused as a cell tower can only handle so many requests, DATA or VOICE.....
Fortunately, it doesn't work that way.
A common mistake is in thinking that an IP based backhaul means voice calls don't get dedicated resources. However, carriers use TDM and/or pseudo-wire circuits to make sure that voice calls get all the QoS they need.
Data has to share the remaining bandwidth and is what is subject to congestion.
So FYI, data requests can congest and cause problems with voice even on the Untouched Super Squeaky Clean power known as Verizon's network.....
No. See above. Data loads alone should not cause problems with voice due to limited backhaul on either Verizon or AT&T. Data especially cannot cause a voice problem on Verizon because it's transmitted on separate channels.
Data can (and does) cause dropped voice calls on AT&T because GSM 3G shares the same channel for data and voice (thus allowing their simultaneous use). Data transmissions can affect voice calls, and vice versa. This is because more 3G voice or data users cause a cell's effective radius to shrink, and marginal users will often get dropped. So a new data user can drop voice users on AT&T.
Another problem with GSM 3G is that if you're on a voice call and then use data simultaneously, the phone+network has to drop the voice connection and reconnect instantly as a combined data call, which can fail. You might not even know the phone is trying to do this in the background for push email or notifications data. All you know is that your voice call dropped. (Which is why some people stick to EDGE, which does not support simultaneous comms.)
I get dropped calls constantly. I'd say it's approaching 50% of the time. I am not even in a rural area at all. My phone will say 3-4 bars and then, when I go to make a call, it drops down to 0-1 bars. I just turned it on, just now, and it showed 4 bars, and then it dropped to 2 bars immediately. I think their software is trying to be optimistic or something. It's like magic!
GSM uses a form of CDMA called WCDMA for 3G.
(W)CDMA works by having every phone talking at once, just like picking out a voice in a crowd in a noisy room. The more phones talking to a cell, the louder everyone has to talk to be heard. The overall signal level doesn't matter, but only the usable ratio of your own signal levels to the noise floor.
If a phone displayed this ratio, it would fluctuate wildly as users come and go. So idle phones usually display the steady power level of a transmitted pilot channel from the tower instead. Basically, the closer you are, the higher the level, which a user can understand.
Once you connect, the phone can actually determine the connection quality because then it knows its communication error rate. That's why the bars will fluctuate after connection.
Your phone could show only one bar of pilot signal, but still get a great connection if you're the only one using that cell. Or you could have full bars of pilot signal, but a terrible connection if you're sharing the cell with too many others.
So bars are basically meaningless until connected, and even then only show the quality incoming to the phone, not how well you transmit to the tower.
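The pilot-bars versus connection-quality point above can be illustrated with a toy calculation. All numbers and thresholds here are invented; real WCDMA link budgets involve power control, spreading gain, and fading, so treat this purely as a sketch of the idea:

```python
# Toy illustration: idle "bars" come from the tower's steady pilot power,
# while actual link quality is a signal-to-(interference + noise) ratio.

def pilot_bars(pilot_dbm):
    """Map pilot power (dBm) to a bar count, as an idle phone might."""
    thresholds = [(-75, 4), (-85, 3), (-95, 2), (-105, 1)]
    for level, bars in thresholds:
        if pilot_dbm >= level:
            return bars
    return 0

def connection_quality(own_mw, other_users_mw, noise_floor_mw):
    """Usable quality is the ratio of your signal to everyone else's
    signals plus the noise floor -- not the absolute level."""
    return own_mw / (sum(other_users_mw) + noise_floor_mw)

# Close to the tower but sharing the cell with 30 other users:
crowded = connection_quality(1.0, [1.0] * 30, 0.1)
# Far from the tower but alone on the cell:
alone = connection_quality(0.05, [], 0.1)

print(pilot_bars(-70), round(crowded, 3))   # full bars, poor ratio
print(pilot_bars(-104), round(alone, 3))    # one bar, much better ratio
```

The crowded case shows full bars with a ratio of roughly 0.033, while the lone user shows one bar with a ratio of 0.5, matching the post's point that bars before a call say little about the connection you'll actually get.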
robbieduncan
Mar 13, 03:51 PM
That's fine for soaking up occasional peak demand (I linked to 'vehicle to grid' techology a few posts back), but not providing energy for a full night... unless you have a link that says otherwise?
The obvious real answer is a globally connected power grid with generation all over the place, so that night is not such an issue. Of course, we'd need to agree on voltages, frequencies, cost, etc.
koobcamuk
Apr 15, 09:22 AM
What's LGBT?
Therbo
May 2, 09:41 AM
Please, enlighten us how "Unix Security" is protecting you here, more than it would on Windows ? I'd be delighted to hear your explanation.
A lot of people trumpet "Unix Security" without even understanding what it means.
The Unix permission system: on Windows, a virus can just access your system and non-owned files, whereas Unix/Linux doesn't allow that.
But of course it doesn't protect against bad passwords or stupidity.
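The permission model being described can be demonstrated with a short Python sketch. It runs against a scratch file we create ourselves (rather than a real system file, since outcomes there depend on who runs it), stripping write permission to mimic a file owned by another user:

```python
import os
import stat
import tempfile

# Make a scratch file we own, then strip write permission, mimicking
# a system file that some other user (e.g. root) owns.
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, stat.S_IRUSR)  # mode 0o400: owner read-only

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))                 # 0o400
print(os.access(path, os.W_OK))  # False for an unprivileged process

# Because we own the file, we may hand write permission back;
# a different, non-root user could not change the mode at all.
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)
print(os.access(path, os.W_OK))  # True again
os.unlink(path)
```

The kernel enforces these mode bits on every open, which is why malware running as an ordinary user can't quietly rewrite system files the way it historically could on single-user Windows setups; and as the post says, none of this helps if the user hands over their password.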