GNU make is freely available and works well on all of our Unix platforms, so we actually replace the vendor-provided make to avoid confusion and problems with Makefile compatibility. (The make provided with Linux systems is already the GNU version.) Info Pages and Man Pages are available on our Web site.
Make.Common is a Makefile intended to be included by all project directory Makefiles. It is designed to make the project Makefiles as simple as possible. To use it, include it twice, using the include directive, in your regular Makefile:
# Example Makefile

# Include Make.Common to set all make variables to defaults:
include ../Make.Common

# ... Project-specific variable definitions and targets go here ...

# Include Make.Common again to define the targets for make:
include ../Make.Common

# End of Makefile.
(See Make.Common itself for more examples.)
Each subdirectory can contain only one compiled executable or library, and it must have the same name as the directory that contains it. You can, however, create a directory structure as deep as is needed to group related programs and libraries together. Just remember to create a symbolic link to Make.Common in the level above whenever you make a new directory tree. (See how /cfht/src/pegasus/cli/ is set up for some examples.)
Shell scripts must all be named *.sh and placed in a subdirectory called ./scripts/, which can appear anywhere in your tree. A single ./scripts/ directory can hold multiple *.sh files. When these are installed in the /cfht/bin/ directory, the .sh suffix is stripped off, so they are invoked by their basename, i.e., without the .sh.
Similarly, other non-compiled files such as configuration files, parameter files, and bitmap files are kept together in a directory called ./confs/ and are copied to /cfht/conf/ during a ``make install'' if needed.
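Putting these conventions together, a new directory tree might look something like the following sketch (the names "newtree", "foo", "foo-helper.sh", and "foo.conf" are hypothetical, used only for illustration):

newtree/
    Make.Common -> ../Make.Common    (symbolic link to the level above)
    foo/
        Makefile                     (includes ../Make.Common twice)
        foo.c                        (builds the executable "foo")
        scripts/
            foo-helper.sh            (installed as /cfht/bin/foo-helper)
        confs/
            foo.conf                 (copied to /cfht/conf/ by ``make install'')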
gdb --symbols=/cfht/obs/dumper/dumper --exec=/cfht/bin/dumper --core=.../core
Projects which are expected to compile without warnings can add this to their Makefile:

CCWARN += $(WERROR)
If there is a Make.Local in the top-level src/ directory which defines "WERROR = -Werror", as we have in our source tree at CFHT, then these projects will fail to build when warnings are introduced.
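As a minimal sketch (the real Make.Local at CFHT may contain other settings), the relevant piece is a one-line definition at the top of the tree; individual projects then opt in with the CCWARN line shown above:

# src/Make.Local (sketch)
WERROR = -Werror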
So you can do things like this in the C code:
#include <sys/socket.h>

#ifdef SUNOS
/* Old Suns lack prototypes for these... */
int socket(int domain, int type, int protocol);
int bind(int sockfd, struct sockaddr *my_addr, int addrlen);
int listen(int s, int backlog);
#endif
. . .
It may also be useful for the C source to know the date of the most recently modified source file, the compilation date, and the version number from the Changes file. All three of these can be made available by putting the following in the Makefile:
CCDEFS += $(VERSIONDEFS)
And then the C program would see the following three defines:
int main(int argc, const char* argv[])
{
    printf("Program FOO version %.2f %s (compiled %s)\n",
           VERSION, SOURCEDATE, BUILDDATE);
    . . .
}
The above example assumes the version number should be displayed with two digits after the decimal. Note that VERSION, SOURCEDATE, and BUILDDATE are only available if the project Makefile adds VERSIONDEFS to CCDEFS.
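As a rough illustration only (these flags are actually generated by Make.Common, and the values below are invented), the resulting compile line might then contain something like:

-DVERSION=2.10 -DSOURCEDATE=\"12Aug99\" -DBUILDDATE=\"15Aug99\"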
In addition, the following -D's may come from Make.Common under the following conditions:
-DNO_CFHTLOG  - If /tmp/pipes/syslog.np was not found at compile-time.
-DUSE_EPICS   - If this machine is one on which we want EPICS channel access.
On our primary development system, HP-UX 10, usually only -DHPUX appears, possibly along with -DUSE_EPICS.
CCLINK += -lm
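This pulls in the standard math library. For example, a program like the following sketch (hypothetical, not taken from the CFHT tree) calls sqrt() from <math.h>, which lives in libm and therefore needs -lm at link time:

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* sqrt() comes from libm, hence "CCLINK += -lm" in the Makefile. */
    printf("sqrt(2) = %g\n", sqrt(2.0));
    return 0;
}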
Programs which use X11 pull in the X11 include paths and libraries with:

CCINCS += $(CCINCSX11)
CCLIBS += $(CCLIBSX11)
CCLINK += $(CCLINKX11)
Similarly, programs which use the network should use:

CCLINK += $(CCLINKNET)
This is because some platforms (namely newer Sun Solaris releases) need "-lsocket" and "-lnsl" at the link stage to get the proper network routines, and other platforms may also need "-lresolv" for hostname lookup. Simply using CCLINKNET takes care of whatever is needed on each platform.
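As an illustration (again hypothetical, not from the CFHT tree), code like the following calls gethostbyname() and socket(), which on Solaris live in libnsl and libsocket; linking with $(CCLINKNET) keeps the project Makefile portable:

#include <stdio.h>
#include <netdb.h>
#include <sys/types.h>
#include <sys/socket.h>

int main(void)
{
    struct hostent *h = gethostbyname("localhost"); /* resolver: -lnsl/-lresolv on some platforms */
    int s = socket(AF_INET, SOCK_STREAM, 0);        /* sockets: -lsocket on Solaris */

    printf("lookup %s, socket %s\n",
           h ? "ok" : "failed", (s >= 0) ? "ok" : "failed");
    return 0;
}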
SUBDIRS = libdet cdma chmem deti detio lu trafficoff

ifeq ($(TARGET),SunOS-4)
SUBDIRS += detserver rdmem wrmem
endif
Programs which use the CFHT log library (-lcfht) put something like this in the Makefile:

ifeq ($(NO_CFHTLOG),)
CCLINK += -lcfht
endif
And something like this in the C source code:
#ifdef NO_CFHTLOG
    syslog(syslog_type, message);
#else
    cfht_log(CFHT_MAIN, cfht_log_type, message);
#endif
Although I wouldn't recommend it, it is possible to go through Make.Common, search for all the $(INSTALL)'s, and replace them with regular "cp" or "mkdir" commands (depending on whether a file or a directory is being installed). GNU fileutils are not difficult to install, however, so try that first.
Each document should have some kind of Web-accessible version. A format which can be keyword-searched by standard search engines is also valuable. While many search engines can now scan PDF files, HTML is a much more widespread format, with built-in support in every Web browser client application. So HTML, or something which gets converted to HTML on the fly at the Web server end, is the most desirable format for the Web.
Since HTML Web pages do not contain page numbers, and usually also contain graphics that are already rendered at a low resolution (on the order of 100 dpi for a typical monitor), they are generally not suitable for printing at 600+ dpi. A link to a hardcopy version in either PostScript or PDF (preferably both) should be provided on the Web page.
MS Word or Frame files are not acceptable as distributable, printable copies. Many word processing applications have a save-to-HTML option and can also produce output for a PostScript printer or generate PDF format.
These options give mixed results, but as long as the output is decent for both the HTML and the printable versions, any of them is acceptable.
Writing documentation can be a little like processing astronomy data (though less interesting). We typically come up with a combination of tools and recipes that work well. This section describes the particular recipe used to generate this document. It produces reasonably good results, and has the advantage that everything is in plain text source which can be generated, preprocessed, and re-used in the automated fashions to which programmers and astronomers are accustomed. There is no mandate to use this particular method.
Figures are generated with XFig, a simple 2D drawing program. They are saved as Encapsulated PostScript with an appropriate scaling factor. Both XFig and LaTeX handle a variety of image formats, but a non-bitmapped format like EPS gives the nicest results. (GIF, BMP, and JPEG are all bitmapped formats.)
The document itself is written as a LaTeX article, with this basic structure:
\documentclass{article}
\usepackage{fullpage}
\usepackage{times}
\usepackage{html}
\usepackage{epsfig}

\title{DOCUMENT TITLE}
\author{YOUR NAME}
\date{DATE HERE}

\begin{document}
\maketitle

\begin{abstract}
. . . Abstract here . . .
\end{abstract}

\tableofcontents
\listoffigures

\section{FIRST SECTION}
. . .
\section{SECOND SECTION}
. . .
\end{document}
Hyperlinks can be inserted with the html package's \htmladdnormallink command:

\htmladdnormallink{text for the link}{http://url/}
\begin{figure}[ht!]
\begin{center}
\caption{CAPTION FOR FIGURE GOES HERE}
\label{optional_label_for_references}
\epsfig{file=YOURFILE.eps}
\end{center}
\end{figure}
No matter what, you cannot assume that LaTeX will place the figure at the exact location where these commands appear. Often it will create a separate page and put all the figures together. If this is undesirable, you can tune the behavior a little with the following commands:
\renewcommand{\topfraction}{0.99}       % floats may fill up to 99% of a text page
\renewcommand{\textfraction}{0.1}       % only 10% of a page must be text
\renewcommand{\floatpagefraction}{0.99} % float-only pages must be 99% full
The LaTeX source itself can be edited as raw mark-up in a text editor (similar to editing raw HTML), or a package like LyX can be used for the job. Given the .eps Encapsulated PostScript figures and the .tex file containing the document, it is then possible to generate all the output formats in one step with a simple Makefile and our friend GNU make:
# -------- Makefile for a LaTeX document. ---------------
DOCUMENT=YOURDOC
FIGURES=FIGURE1.eps FIGURE2.eps FIGURE3.eps
L2HOPTIONS=-antialias -no_antialias_text -split +1 \
           -show_section_numbers -long_titles 4

# ------- The rest of the Makefile is generic. ----------
all: $(DOCUMENT).ps $(DOCUMENT).pdf $(DOCUMENT)

$(DOCUMENT): $(DOCUMENT).ps .latex2html-init
	latex2html $(L2HOPTIONS) $(DOCUMENT)

$(DOCUMENT).pdf: $(DOCUMENT).ps
	ps2pdf $< $@

$(DOCUMENT).ps: $(DOCUMENT).tex $(FIGURES)
	latex $(DOCUMENT).tex
	latex $(DOCUMENT).tex
	dvips $(DOCUMENT).dvi -o
The Makefile automatically regenerates all formats (HTML, PS, and PDF) with the single command ``make'' whenever a figure or the document has been updated. It uses ps2pdf to generate the PDF and LaTeX2HTML to generate the HTML. The latter has many features which can be controlled from the LaTeX source; check its manual for more information.
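A typical editing cycle then looks like this (the filenames are the placeholders from the Makefile above):

$ vi YOURDOC.tex   # edit the document source, or update one of the .eps figures
$ make             # regenerates YOURDOC.ps, YOURDOC.pdf, and the HTML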