slow machine performance with read/write txt files...

Discussion in 'AutoCAD' started by CJ Follmer, Jul 31, 2003.

  1. CJ Follmer

    CJ Follmer Guest

    Where I work, we are unfortunately stuck in the R14 world and will be for a
    while yet. :( I've been writing routines to make R14 more usable and have
    really been upgrading our whole CAD setup. I'm concerned, though, that some
    of what I have in mind might have a bad effect on performance, mainly at
    startup.

    I've loaded up acad.lsp with autoloads where applicable and variable
    checkers to correct things. I just recently added a function that restores
    each user's own osnap settings; R14 remembers the setting in the dwg rather
    than the registry. My routine loads a small ASCII text file and sets OSNAP
    accordingly, plus settings for ATTDIA and MODEMACRO. A couple of possible
    future routines I'm working on are a Today-like startup, to be used when
    the user starts AutoCAD with the normal Drawing.dwg startup file, and one
    that works like the WHOHAS routine. Both require that I read and write text
    files. The Today-like routine tracks the user's history and stores that
    file locally, but WHOHAS will store its file on the network and will
    probably have to create directories as it goes.
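
    The read-and-apply part is simple enough; something along these lines (just
    a sketch - the file name and three-line layout here are made up for
    illustration, not my actual file):

    ;; Sketch only: read saved settings from a per-user ascii file and apply
    ;; them. The path and the line layout are assumptions for the example.
    (defun RestoreUserSettings (/ fp line)
      (if (setq fp (open "c:/acadlsp/usersets.txt" "r"))
        (progn
          (setq line (read-line fp))             ; line 1: saved OSMODE, e.g. "179"
          (if line (setvar "OSMODE" (atoi line)))
          (setq line (read-line fp))             ; line 2: ATTDIA, "0" or "1"
          (if line (setvar "ATTDIA" (atoi line)))
          (setq line (read-line fp))             ; line 3: MODEMACRO string
          (if line (setvar "MODEMACRO" line))
          (close fp)
        )
      )
      (princ)
    )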

    The majority of the machines here are 600 MHz or better (the fastest are
    2.0 GHz), but there are several that are only 400 or even 300 MHz!!!! My
    question is: what kind of performance load will my routines place on
    machines like that? Is it possible to construct a speed test that would let
    the routines detect which computers will perform poorly and simply not run
    on them?

    any thoughts?

    thanks
    CJ
     
    CJ Follmer, Jul 31, 2003
    #1
  2. Mark Propst

    Mark Propst Guest

    I'm no expert, but I've found that reading and writing (even on a 450 MHz
    machine) is blindingly fast. Other operations in the routine may cause
    slowdowns, though. For example, in a VBA routine I was strcat'ing the lines
    read from a text file and it was really slow, but it was the concatenation
    that was slow, not the read. Likewise in lisp, append will be really slow
    but cons will be faster, and both will be slower than a plain read loop
    (assuming you're doing something with the lines you read, like putting them
    in a list or strcat'ing them into a string or ???).
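
    Just to illustrate, here's roughly the kind of read loop I mean (only a
    sketch, the function name is made up):

    ;; Sketch: collect a text file's lines into a list. cons plus one reverse
    ;; at the end stays fast; (append lst (list line)) inside the loop gets
    ;; slower and slower as the list grows.
    (defun ReadFileLines (fname / fp line lst)
      (if (setq fp (open fname "r"))
        (progn
          (while (setq line (read-line fp))
            (setq lst (cons line lst))           ; cheap no matter how long lst is
          )
          (close fp)
        )
      )
      (reverse lst)                              ; restore original file order
    )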

    There are many ways to time your routines; how you'd structure it depends
    on your needs.

    Here's my own very primitive timer:
    ;;;-------------------------------------------------------------------;
    ;;; Function:    TimeIn                                                ;
    ;;;-------------------------------------------------------------------;
    ;;; Description: Starts timing function to record elapsed time        ;
    ;;;              between start and end of lisp routine or process     ;
    ;;; Written by:  Mark Propst                                           ;
    ;;;-------------------------------------------------------------------;
    (defun TimeIn ()
      (setq *TimeIn* (getvar "tdusrtimer"))
    )

    ;;;-------------------------------------------------------------------;
    ;;; Function:    TimeOut                                               ;
    ;;;-------------------------------------------------------------------;
    ;;; Description: Ends timing function to record elapsed time          ;
    ;;;              between start and end of lisp routine or process;    ;
    ;;;              prints result to command line                        ;
    ;;; Written by:  Mark Propst                                           ;
    ;;;-------------------------------------------------------------------;
    (defun TimeOut ( / timeElapsed)
      (if *TimeIn*
        (progn
          ;; tdusrtimer is in days, so multiply by 86400 for seconds
          (setq timeElapsed (* 86400 (- (getvar "tdusrtimer") *TimeIn*)))
          (print (strcat "That took " (rtos timeElapsed 2 2) " seconds"))
          (setq *TimeIn* nil)
        )
        (print "Global *TimeIn* not set")
      ) ;if
      timeElapsed
    )

    usage:
    (timein)
    ....do something
    (timeout)
    ....prints result to textscreen
    I'm sure you can adapt the idea to what you want.
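
    If you wanted to turn the same idea into a go/no-go check at startup, maybe
    something like this (just a rough sketch - the dummy workload and the 0.5
    second cutoff are arbitrary, you'd have to calibrate them on your own
    machines):

    ;; Rough sketch: time a small dummy workload once at startup and set a
    ;; global flag the heavier routines can check before running. The
    ;; workload size and threshold below are guesses, not tested values.
    (defun MachineFastP (/ start elapsed i lst)
      (setq start (getvar "tdusrtimer")
            i     0)
      (repeat 5000
        (setq lst (cons (itoa i) lst)
              i   (1+ i))
      )
      (setq elapsed (* 86400 (- (getvar "tdusrtimer") start)))
      (setq *RunExtras* (< elapsed 0.5))         ; nil on the slow boxes
    )

    Then your acad.lsp routines could just check *RunExtras* before doing
    anything heavy.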
     
    Mark Propst, Jul 31, 2003
    #2
  3. CJ Follmer

    CJ Follmer Guest

    Thanks. The Today-like routine is probably the most demanding, as I have it
    sorting and comparing. I make a list from the text in the txt file, and if
    the file that was just opened is already in the list further down, I move
    it to the top, which means searching for a match and then remaking the list
    with that string removed and re-inserted at the top. I also drop extra
    entries beyond the allowed history length.
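
    Roughly what I mean is something like this (made-up names, not the actual
    routine):

    ;; Sketch only, not the real code: move the just-opened drawing to the
    ;; front of the history list, dropping any older copy of it, then trim
    ;; the list to the allowed number of entries.
    (defun UpdateHistory (dwgname hist maxlen / item newlst cnt)
      ;; keep everything except the old copy of dwgname
      (foreach item hist
        (if (/= item dwgname)
          (setq newlst (cons item newlst))
        )
      )
      (setq newlst (cons dwgname (reverse newlst)))
      ;; trim to maxlen entries
      (setq hist nil
            cnt  0)
      (foreach item newlst
        (if (< cnt maxlen)
          (setq hist (cons item hist)
                cnt  (1+ cnt))
        )
      )
      (reverse hist)
    )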

    So that routine may be getting to be too much for the slower machines. I'll
    have to try it out.

    CJ
     
    CJ Follmer, Jul 31, 2003
    #3