What does memory footprint mean anymore?

Discussion in 'Windows Desktop Systems' started by Petros, Jul 13, 2004.

?
  1. Nope

    4 vote(s)
    40.0%
  2. Yup

    6 vote(s)
    60.0%
  1. Petros

    Petros Thief IV

    Messages:
    3,038
    Location:
    Pacific Northwest
    In the ancient days of computing (1999), when people only had 32-64 MB of RAM on average, people fretted constantly about how much memory a program took up.

    I see these uber-l337 computer users with a gigabyte of RAM contemplating changing BitTorrent clients because one uses 37 MB and the other uses only 28 MB. Does it really matter when you have so much?
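    For what it's worth, the numbers people quote for clients are usually the peak working set. A minimal sketch of how you might measure a process's own peak footprint from code, using only Python's standard library (the `peak_rss_mb` helper is my own name; `resource` is POSIX-only, so on XP you'd read the working set from Task Manager instead):

```python
import resource

def peak_rss_mb() -> float:
    """Peak resident set size of this process, in megabytes.

    ru_maxrss is reported in kilobytes on Linux (bytes on macOS),
    so this assumes a Linux-style value.
    """
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024

baseline = peak_rss_mb()
payload = [bytes(1024) for _ in range(10_000)]  # hold ~10 MB of small buffers
print(f"peak RSS grew by roughly {peak_rss_mb() - baseline:.1f} MB")
```

    Comparing two runs of the same workload like this is less misleading than comparing two different programs' Task Manager columns, since the working set includes shared DLLs counted against every process that maps them.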
     
  2. NetRyder

    NetRyder Tech Junkie Folding Team

    Messages:
    13,256
    Location:
    New York City
    If there are two apps that do the same thing, and one uses less RAM than the other, I usually go with the one that consumes less.
    However, if a better app uses more memory, I don't bother much. If it's worth it, I'll use it regardless.
     
  3. Maveric169

    Maveric169 The Voices Talk to Me

    Messages:
    1,148
    Location:
    Elkhart, IN
    Yup, what NetRyder said; it really makes no difference to me. I have a gig of RAM, and there's not much I can't run with that much memory.
     
  4. Dublex

    Dublex Quazatron R6 droid

    Messages:
    624
    Location:
    Hertfordshire, UK
    It's more about the quality of the app itself than its memory usage. I have 512 MB, which is mostly enough, although I am upgrading to 1 GB soon, and I'd still rather use more memory for better-managed features than something that uses less memory but isn't as good.

    Good example: StyleXP vs. Windowblinds
     
  5. Glaanieboy

    Glaanieboy Moderator

    Messages:
    2,626
    Location:
    The Netherlands
    I have 256 MB installed, so memory footprint is kind of important to me. That said, I don't start up many programs at a time, so usually I don't care.
     
  6. dave holbon

    dave holbon Moderator

    Messages:
    1,014
    Location:
    London England
    I’ve got 512 MB of RAM installed (XP Home) and habitually check the swap file for use or access. I can’t find any application, including AutoCAD 2005 used with Bryce at the same time, that has accessed the swap file for normal memory use in recent months. This indicates to me that 1 GB of memory usage is beyond most commercial software at present, even on large projects.

    However, both the XP operating system, a message-passing system in fact based on the now-venerable NT kernel with bolt-on bits, and the resulting memory management model as used by most software accessing it, which has its roots in MS-DOS, have a history and culture of conserving memory, so this never seems to be that much of an issue. Games are another story, as are professional graphics modellers using multiple CPUs. But that is another story.

    As memory prices fall, programmers and programming techniques will become sloppier, and so memory usage will increase. This is to some extent already happening; just look at the difference between Office 2000 and 2003. Memory use has never been the real issue. Memory leakage is more important, usually caused by poorly written programmes where optimisation of the basic code never took place due to cost.
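    The "leak" in a poorly written programme is very often just one pattern: a cache or list that only ever grows for the life of the process. A hypothetical sketch of the fix (the `BoundedCache` name and the least-recently-used eviction policy are my own, not from any post here):

```python
from collections import OrderedDict

class BoundedCache:
    """A cache that evicts its least-recently-used entry once full.

    The classic leak is this cache without the eviction step:
    entries accumulate forever and the footprint climbs with uptime.
    """
    def __init__(self, max_entries: int = 4):
        self.max_entries = max_entries
        self._data: OrderedDict = OrderedDict()

    def put(self, key, value):
        self._data[key] = value
        self._data.move_to_end(key)          # mark as most recently used
        if len(self._data) > self.max_entries:
            self._data.popitem(last=False)   # evict the oldest entry

    def get(self, key):
        value = self._data[key]
        self._data.move_to_end(key)
        return value

cache = BoundedCache(max_entries=4)
for i in range(100):
    cache.put(i, str(i))
print(len(cache._data))  # stays at 4, however many items pass through
```

    The memory cost is then fixed by `max_entries` rather than by how long the programme has been running, which is the difference between steady footprint and a slow leak.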
     
  7. Perris Calderon

    Perris Calderon Moderator Staff Member Political User

    Messages:
    12,332
    Location:
    new york
    GREAT TO SEE YOU DAVE!!!...where ya been?

    anyway;

    sloppier is not the right word, Dave; that's a little too cynical, no?

    when a programmer knows users have more memory, they would be poor programmers indeed if they didn't take advantage of the resources that are available, wouldn't they?...graphics, user interface, everything will improve if the programmer knows more resources are available.

    for instance, when a programmer knows a user has dual CPUs, he'd better take advantage of those extra quanta, otherwise his program will not be as good as it could be...same thing with memory...if I'm a programmer, and I know most users have a gig of memory, you'd better believe graphics and animation are gonna reflect what I know the user's hardware can handle

    for instance, I have 512 MB, which used to be fine, but now that I'm running a GPS app and leaving it running while I'm doing other things on my box at the same time, my memory comes under pressure all the time.

    anyway, great to see you...better come around more often
     
  8. dave holbon

    dave holbon Moderator

    Messages:
    1,014
    Location:
    London England
    PERRIS:

    Well, I’ve been doing mostly this and that; sort of, well, chilling in (as opposed to out) and watching television programmes about the Prohibition era in America and its implications in the rise and politicisation of mobsters, some large corporations, and the Kennedy family, all of whose fortunes were made in that era.

    Anyway, I suppose the way forward in the personal computer world, as I see it, must be towards multiple CPUs. This is already happening, with all the CPU manufacturers now nearly ready to produce two or more CPUs on a chip. This is not new technology but old, and has been around for more than ten years. In only a few years' time, when you buy a CPU it will contain 1 GB of memory on the chip itself and four CPUs, along with all the memory controllers and so on. This is already on the drawing boards at AMD and Intel. Having said that, I remember the single PC-on-a-chip being mooted ten years ago; it never happened, for various reasons, but mostly production tolerances, ever-changing standards, and a lack of flexibility in both the manufacturing process itself and software reliability, which is still an issue today.

    Hard disk drives of the mechanical variety, whilst cheap at the moment, are already obsolete as a technology, and have been for some years. What's the point of having a mechanical device (including CD/DVD writers etc.) if you can just plug your 50 GB solid-state memory pen into whatever port to transport data?


    :) :) :)
     
  9. Henyman

    Henyman Secret Goat Fetish Political User

    i have 1024 MB of RAM, and to be honest i don't look at my programs' footprints. i like the progs i use, so i let them use what they need :s
     
  10. Ferretlovers

    Ferretlovers Giver of 2 Cents

    Messages:
    40
    I have 512 MB of memory, and I am constantly looking at the footprint of my programs because I play Neverwinter Nights, and that needs all of the memory and CPU time it can get, so I am constantly "trimming the fat" where I can.

    Take Care,
    Mike
     
  11. NetRyder

    NetRyder Tech Junkie Folding Team

    Messages:
    13,256
    Location:
    New York City
    If Longhorn/WinFX continues on the road it's on right now, .NET managed code will make it so that we don't have to worry about memory leaks in the future.
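    For illustration, a garbage collector of the kind NetRyder means reclaims objects once nothing in the program can reach them, even mutually-referencing cycles that naive reference counting would leak forever. A small sketch of the same principle using Python's stdlib collector (the `Node` class is hypothetical, just enough to build a cycle):

```python
import gc
import weakref

class Node:
    """Two nodes pointing at each other form a reference cycle: neither
    refcount ever reaches zero, yet the pair is unreachable by the program."""
    def __init__(self):
        self.other = None

a, b = Node(), Node()
a.other, b.other = b, a          # create the cycle
probe = weakref.ref(a)           # weak reference: does not keep `a` alive

del a, b                         # drop our only strong references
gc.collect()                     # the cycle detector reclaims both nodes

print(probe() is None)           # True: the managed heap cleaned up after us
```

    Managed runtimes can't stop you hoarding memory in a reachable cache, of course, but "forgot to free it" leaks of the C/C++ variety do largely disappear.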
     
  12. _ronin

    _ronin OSNN Addict

    Messages:
    55
    Speaking of torrent clients, ABC is by far the best. I've tried Azureus on the recommendation of my peers, but it would quite literally bring my computer to a screeching halt.

    I have what is considered to be a fairly fast computer:

    2600 XP
    512MB DDR400 dual channel

    Plenty of hard drive space, yet it was often a snail race with just the BitTorrent client running against a freshly booted OS. Oddly enough, the resource consumption looked forgivable, but the performance said otherwise. ABC has been the best of the lot.