A little back story: my company is starting to replace our remote thin clients with iMacs. Right now the image we use logs the Mac into a pretty locked-down user account on boot. From there they can launch a VPN client, SIP phone, and RDP client to get into our terminal servers to do work. That part all works great, more or less.

I am working on a script that runs on login, goes out to an FTP site, and downloads any updates to local software/documentation. This part is also more or less done and works pretty well. I wrote a wrapper script that grabs a bunch of other scripts off an FTP server, tosses them into a local directory, and then runs them all one by one.

What I would like to do is have the wrapper script redirect all standard out, from when it starts to when it ends, to a file. So if the script was

#!/bin/tcsh -f
<INSERT MAGICAL REDIRECT TO FILE /tmp/log HERE!!!>
echo "whatever"
ls -l /User/Shared

then /tmp/log would contain:

whatever
Garageband demo Songs
Goat porn

rather than me putting a >> /tmp/log after each command. Is this possible at all?

Now I know you're thinking, "What the hell, you retard? Why don't you just redirect the output when the script is called?!" Well, I'd love to just do script.sh >> /tmp/log, but the startup script is being run from something called a LoginHook in OS X. If I have the login hook set to run just my script (/User/Shared/script.sh) it seems to work fine. If I change the login hook to /User/Shared/script.sh >> /tmp/log, it stops running.

Aside from the LoginHook thing, it would also be handy if the script could determine on the fly where it's going to output stuff, instead of having the output file locked in place in the LoginHook.

So my questions are:

Is there a way to redirect ALL stdout (and stderr) in a script?
Is there a better way to do a login script than this goofy LoginHook shit?
Is there something I've missed that will make the above moot?

thx

[Edited on January 2, 2008 at 5:11 PM. Reason : fucckin >>>>]
[Edited on January 2, 2008 at 5:14 PM. Reason : ,,,]
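For reference, a LoginHook on OS X of that era is registered through defaults, roughly like this (using the script path from above as the example):

sudo defaults write com.apple.loginwindow LoginHook /User/Shared/script.sh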
1/2/2008 5:07:28 PM
Can you create two scripts: one that is your actual script, then another that executes the script with the output redirected? Something like the sketch below.

Another option is to create a variable $OUTPUT, set it at the top of the file, and do >> $OUTPUT for every command in the script that produces output.

I would think surely there's a way to redirect stdout globally, though...

[Edited on January 2, 2008 at 8:28 PM. Reason : ]
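A rough sketch of that two-script idea, assuming the LoginHook points at the wrapper; the paths and names here are just placeholders:

#!/bin/sh
# /User/Shared/wrapper.sh -- hypothetical wrapper the LoginHook would call.
# Runs the real login script and appends everything it prints (stdout and stderr) to a log.
LOGFILE=/tmp/log
/User/Shared/script.sh >> "$LOGFILE" 2>&1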
1/2/2008 8:26:40 PM
Dammit, I was really looking forward to saying, "What the hell, you retard?"
1/2/2008 8:57:11 PM
I figured it out!

http://learnlinux.tsf.org.za/courses/build/shell-scripting/ch12s04.html

That website details the process. Basically, you do

exec 1>file.log

and all stdout for that shell gets redirected to file.log. I'm not sure how this handles stderr, but that site should get you going. Read up on file descriptors as well if you don't know what those are.
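For example, a minimal bash sketch (the log path is just an example, and note the original script used tcsh, which doesn't have bash's numbered file-descriptor redirection syntax):

#!/bin/bash
# Redirect this shell's stdout to a log file for the rest of the script.
exec 1>/tmp/log

echo "whatever"       # lands in /tmp/log instead of the terminal
ls -l /User/Shared    # so does this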
1/2/2008 9:11:38 PM
In most Linux and BSD distros there is a utility called 'script' that does this. Explained here:

http://linux.byexamples.com/archives/279/record-the-terminal-session-and-replay-later/
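Usage is roughly like this, if I remember right (the BSD version shipped with OS X and the Linux util-linux version take slightly different arguments, so check the man page):

# BSD / OS X style: log file first, then the command to run
script -q /tmp/log /User/Shared/script.sh

# Linux (util-linux) style: the command goes after -c
script -q -c /User/Shared/script.sh /tmp/log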
1/3/2008 10:58:42 AM
Why can't you have the script run on, say, tty3 and then redirect tty3 > file.log? I know shaggy's tried this, but is there a reason it doesn't work?
1/3/2008 11:50:24 AM
1/3/2008 2:06:14 PM
This works in a bash script:

exec 1>> $LOGFILE
exec 2>> $LOGFILE

The first line redirects standard output, as moron said. The second line redirects standard error.

[Edited on January 3, 2008 at 2:08 PM. Reason : 78 seconds too late]
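Put together, the top of the login wrapper could look something like this sketch (bash; $LOGFILE's value and the echo line are just examples, not the actual script):

#!/bin/bash
# Hypothetical top of the login wrapper script.
LOGFILE=/tmp/log

exec 1>> "$LOGFILE"   # everything written to stdout from here on appends to the log
exec 2>> "$LOGFILE"   # stderr too

echo "login script started $(date)"
# ...fetch the other scripts from the FTP server and run them here...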
1/3/2008 2:07:32 PM
we have a story-teller
1/4/2008 4:58:04 PM