The following program dumps core on UNIX with a floating-point divide error. This is puzzling because it performs no floating-point operations.
To locate the problem, we put a few printf statements into the code and discovered that the crash happens somewhere before the call to the function: we never see the "Starting" message.
/************************************************
 * Compute a simple average. Because this       *
 * takes a long time (?) we output some         *
 * chatter as we progress through the system.   *
 ************************************************/
#include <stdio.h>

/************************************************
 * average -- Compute the average given the     *
 *            total of the series and the       *
 *            number of items in the series.    *
 *                                              *
 * Returns:                                     *
 *      The average.                            *
 ************************************************/
int average(
    const int total,   // The total of the series
    const int count    // The number of items
)
{
    return (total / count);
}

int main()
{
    int ave;    // Average of the numbers

    printf("Starting....");
    ave = average(32, 0);
    printf("..done\n");

    printf("The answer is %d\n", ave);
    return (0);
}