…It’s pissin’ me off. Anyway, the problem is this:

Write a program to input a sequence of numbers and output the minimum and maximum values in the sequence.
Example:
How many numbers? : 8
Input numbers : 11 7 19 5 2 5 2 10
Minimum = 2
Maximum = 19

And so far, I have this:

#include <stdio.h>
#include <conio.h>

int main(int argc, char **argv){
int n,i,a[6],max,min;

printf("\n\n");

printf("n=");scanf("%d",&n);
for(i=0;i<n;i++)
{
printf("a[%d]=",i);
scanf("%d",&a[i]);
}
do
{
(i++);
}while(i<a[0]);
min=a[i];
max=a[n-1];
printf("The max of the elements is %d\n\n", max);
printf("The min of the elements is %d\n", min);

printf("\n\n");

return 0;
}

Anyway, I’m not sure how to make it so the max and min that get printed are the actual max and min. Could someone show me how and explain it to me?

The easiest way, without showing you actual code is as follows: Set min and max both to be the first number in the sequence. Then go through each number in the sequence and compare it to both min and max. If it’s larger than max, set max to it, and if it’s smaller than min, set min to it. At the end of the loop, min will be the smallest number and max will be the largest.

What you want to do is include a checker inside the for loop and do the assignment to max for the first number like Cid said. Something like this, maybe:

bool firsttime = true;   /* needs #include <stdbool.h> */
printf("n=");scanf("%d",&n);
for(i=0;i<n;i++)
{
    printf("a[%d]=",i);
    scanf("%d",&a[i]);
    if (firsttime) {
        /* make both max and min equal to a[i], which is a[0] here */
        max = min = a[i];
        firsttime = false;
    } else {
        if (a[i] > max)
            max = a[i];   /* a[i] becomes the new max */
        if (a[i] < min)
            min = a[i];   /* a[i] becomes the new min */
    }
}

The problem there might be an extra semicolon. Make sure that there isn’t one directly after your if statement (only inside the statement’s body). Also, the less-than sign doesn’t show up on the boards and messes up the code, so use the entity &lt; instead.

I am so sick of these stupid programming languages. Why can’t we use a language with a real mathematical basis instead and recognize min and max as the catamorphisms they truly are?

I don’t think a semicolon’s the problem (and you also used greater-than signs where you’re supposed to use less-than signs). The problem is your logic: if you set min to -10000 and max to 10000, then any number between -10000 and 10000 will never show up as max or min. You meant to do the inverse, although even that isn’t the best idea: what if every number someone put in was greater than your preset max or smaller than your preset min? The best way would be, like Cid said, to set the very first input as both the max and min.

These are incorrect logically, as Cless said. They should refer to the zeroth element of the set being compared:

max=a[0];
min=a[0];

This means that if the first value is the highest, it will stay that way, and vice versa.

Now you need to compare with the next item in the set for both. An iterative approach is obviously the best way to do this.

for (i = 1; i < n; i++) {   /* a[0] is already both min and max */
    if (min > a[i])
        min = a[i];
    else if (max < a[i])
        max = a[i];
}

Obviously if it’s the smallest number in the set, you don’t need to worry about checking it against the highest number (hence the else if). Also, you need to make sure you don’t run off the back of the array, and check for an empty input (an n of zero).

That code probably won’t compile as-is, but I’m sure you can fix it up no probs. It’s possibly cleaner than your implementation, but don’t let that get you down.

Edit! Had my less-than/greater-than on the wrong values, resulting in max being min and min being max.