Mooshak's Help

  1. System Concepts. A must for beginners.
  2. Managing Programming Contests
  3. Mooshak interfaces
  4. Frequently asked questions
Mooshak's Help: Concepts

Mooshak's Concepts

This page starts with some background related to Mooshak, then explains how Mooshak works, the decisions taken when judging contestants' submissions, and how teams are ranked in a contest.
Programming Contests
Brief background on programming contests: how they started, why are they useful, where do they take place.
About Mooshak
Short explanation on what is Mooshak and its main features.
Automated Judging
Explains Mooshak's approach to automated judging, the kinds of messages that might result from a submission's evaluation, and the rules used to rank teams during the contest.
Ranking Rules
Details the rules used to produce the teams' classification.
Security
Details Mooshak's approach towards ensuring system security and integrity.
Reliability
Details how Mooshak replicates data and computations to ensure system reliability.

International Programming Contests

The ACM International Collegiate Programming Contest is a programming world championship for college students, organized and conducted yearly by the ACM. It started in 1970 as a local contest in Texas and has since grown rapidly in the number of participating universities. The numbers are impressive: in 2000 there were more than 2700 teams, from 1079 universities and 70 countries, participating in 42 regional contests distributed among 82 locations.

The ACM programming contest provides students with an opportunity to demonstrate and sharpen their problem-solving and computing skills. Apart from the fun of competing (and hopefully winning), the contest is also an excellent opportunity to make international contacts in computing science. The Contest is a two-tiered competition among teams of students representing institutions of higher education. The winning teams of the regional contests (held from mid-September to mid-December each year) go forward to the contest world finals, held in the following Spring.

The Regionals and World Finals usually comprise a five-hour programming contest with 8 or 9 problems to be solved by teams. Teams are composed of up to 3 students and may submit their program solutions in a number of programming languages, usually Pascal, C, C++ or Java. A submitted solution is declared accepted if it produces the same outputs, for a set of input tests, as those of the jury. The team that solves the most problems in the least (accumulated) time is declared the winner.

These events are coordinated by the ACM-ICPC (International Collegiate Programming Contest, directed by Professor Bill Poucher), under the patronage of the ACM and sponsored worldwide by IBM.

There are other similar contests taking place:

  • Preliminary contests: these are contests mainly directed towards selecting teams for the Regionals; they usually take place at the university or country level.

  • Online contests: these may be contests running simultaneously with some other official contest (local or regional), or simply team-preparation contests. The following sites frequently organize online programming contests: Valladolid and Ural University.

  • 24-hour judges: these are associated with sites that function as problem archives. They allow site members to submit solutions to the problems in the archive, and the members are ranked by the number of problems solved. The following sites are two of the main problem archives with a 24-hour judge: Valladolid and Ural University.

Mooshak's Help: Main features

What is Mooshak?

Mooshak is a client-server application to fully manage and run programming contests. It is web-based and therefore all of its functionality is accessible through interfaces deployed on a web browser, irrespective of the operating system where the browser is running. These interfaces use the HTML 4.0 frame set and no processing is done on the browser, except for some data input validations that are implemented in ECMAScript. Java and plugins were avoided on purpose, to simplify the use of the interface from any machine on the Internet.

Main Features

Mooshak provides a number of features, namely:
General features:
  • Multi-user: accommodates different users with rather different access permissions to the system: contest directors, judges, contestants and the general public.
  • Multi-site: supports single-site as well as multi-site contests; it is also prepared to allow simultaneous local and online contests.
  • Conforms with official rules: conforms with the ACM-ICPC rules to classify contestants' submissions.
  • Flexible: can import data files produced by the ACM-ICPC central registration database, and provides means to produce participation certificates based on the contest classification.
  • Multi-language: supports a number of programming languages, namely C, C++, Pascal, and Java.
System security and reliability:
  • User-authentication: access to the system is controlled by user authentication, except in the case of the public view that is open to everyone.
  • Reliability: supports data replication to allow rapid system recovery in case of failure.
  • Safe-exec process environment: provides a safe execution environment to run the programs. This environment limits resources available to contestant programs, hence preventing them from interfering with the system.
Users and interfaces:
  • Multiple user-views: provides different interface views according to the type of user accessing the system. The interface only shows the functionalities accessible to that user.
  • Administration view: allows contest directors to set up a new contest and to add all the data necessary to make it operational: problem descriptions; solutions, test inputs and outputs for the problems; teams and contestants data; programming languages; the date of the contest, starting and finishing times, etc.
  • Jury view: allows a human judge to validate the judging made by the automatic judging system, to re-evaluate submissions if necessary, to answer questions posed by contestants, and to track the handling of printouts to contestants.
  • Contestants view: allows contestants to submit solution programs, to ask questions, to print their programs, to access the submissions list, to access current classification, and to visualize problem descriptions.
  • Public view: allows any user on the Internet to follow the progress of the contest as it is taking place; they can access the submissions list, current classifications and other statistics listings regarding the contest.
Mooshak's Help: Automated Judging

Automated Judging

The automated judge is the cornerstone of Mooshak. Its role is to automatically classify a submission according to a set of rules and produce a report with the evaluation for further validation by a human judge.

A submission is composed of the data relevant for the evaluation process, that is, the program source code, the team-id, the problem-id, and the programming language (the last is automatically inferred from the source code file extension). Submissions are automatically judged and the corresponding result is displayed to the teams almost instantaneously, although initially in a pending state.

Human judges have the responsibility of validating pending classifications, making them final, and occasionally modifying initial classifications. A classification may have to be modified as a result of changes in the compilation and execution conditions (e.g. changes in test cases) that require reevaluating submissions. Reevaluation produces another report that has to be compared with the previous ones.

The automated judging can be divided into two parts according to the type of analysis:

  • Static analysis checks the integrity of submitted data (source-program) and, if successful, produces an executable program.

  • Dynamic analysis is performed upon a successful static analysis phase and is composed of one or more executions of the program.

Static analysis starts by verifying whether the submitted problem has already been solved by that team, in which case the submission is rejected and no classification is given. It then confirms the verifications made by the interface, i.e. it double-checks the submitted data for team ownership, problem-id and size of the program source. If this verification succeeds, it compiles the submitted program using the compilation command line defined in the administration interface. Mooshak can be more or less tolerant according to the flags chosen for each compiler. An error or compiler warning detected at this stage aborts the automated judging and dynamic analysis is skipped. The following table lists the verifications performed during static analysis and the classification associated with each failure.

Verification   Classification
Team           Invalid submission
Language       Invalid submission
Problem        Invalid submission
Program size   Program too long
Compilation    Compile time error
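
As a concrete illustration, the checks above could be sketched as follows. This is a simplified Python sketch with hypothetical names and a toy data model; it is not Mooshak's actual implementation.

```python
# Hypothetical sketch of Mooshak's static-analysis checks.
# known_teams maps a team id to the set of problem ids it may submit;
# solved maps a team id to the set of problems it has already solved.
import subprocess

MAX_SOURCE_BYTES = 100 * 1024  # default source-code size limit


def static_analysis(team, problem, source, compile_cmd, known_teams, solved):
    """Return a classification string, or None if the program compiled cleanly."""
    if problem in solved.get(team, set()):
        return "Rejected (already solved)"  # rejected, no classification given
    if team not in known_teams:
        return "Invalid submission"         # team ownership check
    if problem not in known_teams[team]:
        return "Invalid submission"         # problem-id check
    if len(source) > MAX_SOURCE_BYTES:
        return "Program too long"
    result = subprocess.run(compile_cmd, capture_output=True)
    if result.returncode != 0 or result.stderr:
        return "Compile time error"         # warnings also abort judging
    return None                             # proceed to dynamic analysis
```

A failure at any step short-circuits the pipeline, mirroring the rule that a compile-time error or warning skips dynamic analysis altogether.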

Dynamic analysis involves the execution of the submitted program with each test case assigned to the problem. A test is defined by an input file and an output file. The input file is fed to the standard input of the program being executed, and its standard output is compared with the output file. The errors detected during dynamic analysis determine the classifications listed in the following table.

Severity  Classification         Meaning
6         Requires reevaluation  For some reason the program has to be re-evaluated.
5         Time limit exceeded    The program did not finish within the allocated amount of time.
4         Output too long        The program generated an output too long for this problem; the limits depend on the test cases, but are usually low (the default limit is around 100 KB).
3         Run-time error         The program "crashed", i.e. it exited prematurely due to a run-time error.
2         Wrong answer           The program ran through one or more test cases without a run-time error, but the output did not match the expected output.
1         Presentation error     The output seems to be correct but is not presented in the required format. Since it is not always easy to distinguish this message from the wrong answer message, it is only sent in obvious cases.
0         Accepted               The program passed all tests and is accepted as correct.

Each classification has an associated severity rank and the final classification is the one with the highest severity rank found across all test cases. The highest severity is given to the rare situation where the system has an indication that the test failed due to lack of operating system resources (inability to launch more processes, for instance). The lowest severity is the case where no error was found on any test case, and therefore the submission is accepted as a solution to the problem.
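
The aggregation rule, together with the per-test output comparison described below, can be sketched in Python. This is an illustrative simplification, not Mooshak's actual code.

```python
# Severity ranks from the table above; the final classification is the
# most severe result observed across all test cases.
SEVERITY = {
    "Requires reevaluation": 6,
    "Time limit exceeded": 5,
    "Output too long": 4,
    "Run-time error": 3,
    "Wrong answer": 2,
    "Presentation error": 1,
    "Accepted": 0,
}


def normalize(text):
    """Trim white characters and collapse runs of them to a single space."""
    return " ".join(text.split())


def classify_one_test(program_output, expected_output):
    """Exact match -> Accepted; match after normalization -> Presentation error."""
    if program_output == expected_output:
        return "Accepted"
    if normalize(program_output) == normalize(expected_output):
        return "Presentation error"
    return "Wrong answer"


def final_classification(per_test_results):
    """Pick the classification with the highest severity rank."""
    return max(per_test_results, key=SEVERITY.get)
```

For example, `final_classification(["Accepted", "Wrong answer", "Presentation error"])` yields `"Wrong answer"`, the most severe of the three.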

The automatic judge marks an execution as "Accepted" only if the standard output is exactly equal to the test output file. Otherwise, the output file and the standard output are normalized and compared again. In the normalization, both outputs are stripped of all formatting characters. If after this process the outputs become equal, the submission is marked as having a "presentation error"; otherwise it is marked as a "wrong answer".

In the current implementation the normalization trims white characters (spaces, newlines and tabulation characters) and replaces sequences of white characters with a single space. This is a general normalization rule, since white characters are only used for formatting. In a specific problem, other classes of characters could have the same meaning. For instance, in a problem where the only meaningful characters are digits, other characters, such as letters or punctuation, could be treated as formatting characters. This cannot be done in general, since many problems have meaningful output that includes letters. This feature would require having a class of meaningful characters defined for each problem's output.

Mooshak's Help: Ranking Rules

Rules to rank teams:

Mooshak uses the rules defined by the ACM-ICPC committee to rank contestants at contests.
  1. The team that solved most problems is ranked first.

  2. Teams that solved the same number of problems are ranked by the least total time.

  3. The total time is the sum of the time consumed for each problem solved. The time consumed for a solved problem is the time elapsed from the beginning of the contest to the submission of the accepted run, plus 20 minutes for each rejected run. There is no time consumed for a problem that is not solved.
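
These rules can be made concrete with a small Python sketch (an illustration, not Mooshak's actual code; times are minutes from the contest start):

```python
PENALTY = 20  # minutes added per rejected run on a problem that is eventually solved


def team_score(runs):
    """runs: list of (problem, minutes, accepted) in submission order.
    Returns (problems_solved, total_time)."""
    solved, total, rejected = set(), 0, {}
    for problem, minutes, accepted in runs:
        if problem in solved:
            continue  # runs after acceptance do not count
        if accepted:
            solved.add(problem)
            total += minutes + PENALTY * rejected.get(problem, 0)
        else:
            rejected[problem] = rejected.get(problem, 0) + 1  # no cost if never solved
    return len(solved), total


def rank(teams):
    """teams: list of (name, runs). Most problems first; ties broken by least total time."""
    return sorted(teams, key=lambda t: (-team_score(t[1])[0], team_score(t[1])[1]))
```

For example, a team that fails problem A at minute 30, solves it at minute 50, and solves B at minute 100 scores 2 problems in 50 + 20 + 100 = 170 minutes.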

Mooshak's Help: Security

Security

An initial security issue relates to the system's support for different types of users with different access permissions. This is handled by ensuring user authentication and then associating each type of user with a different interface view of the system.

The compilation and the execution of programs are the two most insecure points of a contest management system. Provided it fits in a single file, a team can submit virtually any program in one of the contest languages, including a bogus or malicious program capable of jeopardizing the system and ruining the contest. For that reason Mooshak compiles and executes programs in a secure environment, with the privileges of an unprivileged user and with several limits. Most of these limits are independent of the problems, with the exception of the execution timeout, which is adjusted to each problem. The timeout for each problem is determined before the contest: it is the maximum time taken by the judges' solutions, over all test cases, rounded up to the next integer (in seconds). The timeout for compilation is 60 seconds. The other resource limits enforced are listed in the following table, with their default values in bytes (except for the number of child processes).

Maximum limit          Value
Process data segment   2 MB
Process stack segment  1 MB
Process RSS            4 MB
Output                 100 KB
Source code            100 KB
Child processes        0
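
As a hypothetical illustration, the per-problem timeout rule and the default limits above could be represented like this (the names and data layout are assumptions, not Mooshak's actual configuration format):

```python
import math

# Default resource limits from the table above (bytes, except child_processes).
DEFAULT_LIMITS = {
    "data_segment": 2 * 1024 * 1024,
    "stack_segment": 1 * 1024 * 1024,
    "rss": 4 * 1024 * 1024,
    "output": 100 * 1024,
    "source_code": 100 * 1024,
    "child_processes": 0,  # a count, not bytes
}
COMPILE_TIMEOUT = 60  # seconds


def problem_timeout(judge_times):
    """Maximum time taken by the judges' solutions over all test cases,
    rounded up to the next integer (in seconds)."""
    return math.ceil(max(judge_times))
```

For instance, if the judges' solutions take 0.8 s, 2.3 s and 1.9 s on the test cases, the problem's timeout becomes 3 seconds.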
Mooshak's Help: Reliability

Reliability

A single Mooshak node - one Web server accessible through a set of Web clients on users' machines - is sufficient for running a small programming contest (i.e. a contest with up to 20 teams) where reliability is not at a premium. Running an official contest, with a concern for reliability, a larger number of teams distributed over several sites, and a simultaneous online contest, requires a more complex setup, with a network of interconnected nodes.

A link from a node X towards a node Y represents the direction in which contest data is replicated (from server X to server Y). The main reasons for replicating contest data between Mooshak servers are to support:

  • System Backup: replication is used to maintain a backup system, with an updated version of the contest data, so that it can replace one of the servers in case of hardware failure.

  • Online Contest: replication propagates the contest data to a server with Internet access used to maintain an online contest simultaneously with an official local contest.

  • Load balancing: several servers distribute load among them and replicate their data to the others. In this case each server is assigned to a set of users, for instance, contestants to a server and jury to another, or contestants in different rooms to different servers.

  • Multi-site contest: This case is similar to the previous but servers are in distant locations.

The Mooshak network configuration for a particular contest may contain several of these links. The following figure represents the network for a contest taking place simultaneously in two sites, A and B, the former using two servers (Server A1 and Server A2) for load balancing and the latter using just one server (Server B). Each site has a backup with an updated version of the contest data, capable of replacing any of the main servers in case of failure. Site A also maintains an online version of the contest where anyone on the Internet can compete against the official contestants physically located at either site A or site B. Some nodes are connected by unidirectional links, such as those connecting servers to the backup nodes or online-contest servers, and others are bidirectional, such as those connecting the contest servers among themselves.

Mooshak's replication uses the rsync remote-update protocol. This protocol updates the differences between two sets of files over a network link, using an efficient checksum-search algorithm. The replication procedure is invoked frequently, typically every 60 seconds, to propagate changes to the other servers, and copies only the data that has changed since the last replication. The object files produced by the compilation of programs are not replicated, just the evaluation reports. If necessary, the programs may be reevaluated on a different machine.

The main issue with replication is the consistency of contest data, namely that no data fails to be replicated or is overwritten by replicated data. To guarantee that no data fails to be replicated we must ensure that there is a replication path connecting all servers interfacing with official contestants.

To address the problem of data being overwritten, we must differentiate between contest definition data (such as teams, problems and programming languages) and contest transactions (such as submissions, questions and printouts). Of these two, contest transactions, especially submissions, are particularly important. To guarantee uniqueness, all transaction data is keyed by a timestamp, the team ID and the problem ID. Thus, if the team ID is unique in the system, and transactions from the same team are consistently sent to the same server, then there is no danger of losing transactions to overwritten data, since each transaction key is also unique.
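
The key scheme can be sketched in a few lines of Python (field names and the dict-based store are illustrative assumptions, not Mooshak's schema):

```python
import time


def transaction_key(team_id, problem_id, timestamp=None):
    """Key a transaction by (timestamp, team, problem), as described above."""
    ts = timestamp if timestamp is not None else int(time.time())
    return (ts, team_id, problem_id)


def merge(local, replicated):
    """Merge replicated transactions into the local store without ever
    overwriting an existing record. If each team always submits to the
    same server, keys from different servers cannot collide."""
    merged = dict(local)
    for key, record in replicated.items():
        merged.setdefault(key, record)  # keep the local record on conflict
    return merged
```

The `setdefault` call encodes the consistency rule: replication may add transactions, but never replaces data already present on the receiving server.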

Contest definition data is not, in principle, changed after the beginning of the contest. It should be updated in a single node for consistency's sake, and that node must have a path to every other node in the network. The only exception is the creation of teams on online-contest servers, as we allow contestants to register during the contest. When using load balancing for online-contest servers, it is important to assign team creation to a single server. Otherwise, two teams with the same name and group, registering at the same time on different servers, could (although it is not very likely) share the same record.

For the above setup to work properly, all servers clocks must be synchronized. This can be achieved using the Network Time Protocol (NTP).

Managing Programming Contests

This section summarizes the procedures that contest directors must undertake before, during and after a programming contest in order to run it successfully with Mooshak.
  • Preparing a contest
  • During a contest
  • Concluding a contest

Preparing a contest

Contest preparation is probably the most demanding part for contest directors. In order to accomplish the tasks described in this section, you first need a Mooshak "administrator" account (user admin) and then fulfill the following tasks:

During a contest

A contest starts once the contest is set to the active state in the administrator's control interface and the start time defined in the contest attributes is reached. Once the contest has started, the top-left corner of Mooshak's interface shows the time remaining until the end of the contest.

The main tasks during the contest are related to judging. Contestants' submissions are validated by human judges, and therefore the people acting as judges must be logged into the system with judge accounts.

The judges' interface gives them quick access to the main tasks they must face during the contest, which are:

After a contest

Immediately after a contest terminates, the judges go on to validate all pending submissions as quickly as possible. The contest director and judges also check the classification and start preparing to issue the contest certificates.

Interfaces

Mooshak has 4 types of interfaces, aimed at different kinds of users. Access to them, except for the audience interface, is controlled by authentication:
Administration
Setting up the contest; managing the problems, teams, languages, date, starting and finishing times, etc.
Jury
Validating problem submissions, answering questions and tracking printouts.
Contestants
Submitting programs, asking questions, printing programs and viewing problems.
Audience
Viewing the classification and other listings.

FAQ

  • Setup and initial testing
  • Contest configuration
  • Evaluating submissions
  • Email registration

Setup and initial testing FAQ

  • I get a server error after accessing Mooshak's initial page
  • This probably means that your Apache configuration does not support a /cgi-bin/ directory for users. To allow programs to be executed in this directory, include these lines in the Apache configuration file and restart the server.
    <Directory /home/*/public_html/cgi-bin>
         Options +ExecCGI -Includes -Indexes
         SetHandler cgi-script
    </Directory>
    
  • When I use the save command in the admin's screen I get an error message
  • Mooshak's scripts and data files are installed in a certain OS user's home - by default mooshak - and the CGI scripts should run as the same user. The suexec module of Apache runs CGI scripts in user directories as the corresponding user and ensures that scripts cannot be invoked by other users. Mooshak expects suexec in order to run properly.

    If you get this kind of error then you probably don't have suexec installed. Some distributions install suexec by default when you install Apache. Sometimes you may need to recompile Apache with a certain configuration.

    Of course, you can just give all permissions to all data files by executing the command chmod -R 777 data in Mooshak's home directory, but I advise you against it: you will be compromising your contest's security.

  • I am using Fedora Core 3 and when Mooshak calls gcc to compile C/C++ programs it produces either an Internal Error or a SegFault, but the programs compile on the shell
  • People using Mooshak reported that starting Apache with the init.d script, i.e. httpd start, instead of startssl solved this problem. Apparently this script sets important environment variables.

Configuration FAQ

  • I created several teams for a competition. How can I generate passwords for these teams and send them by email?
    Use the command Passwords | Generate-to-archive in the groups folder of your contest. Then download the archive to your local disk, select the password sheets you need and send them by email.
  • I am not going to check every submission and I want to mark them all automatically as final
    You can do that by setting the Default_state to final in the submissions folder of your contest.

Evaluating submissions FAQ

  • I have a submission that Mooshak classifies as ... but when I run it on the shell it works just fine
  • In Mooshak, executions are similar to those you launch from the command line, but keep in mind that Mooshak uses a safe execution command to run your program as an underprivileged user. To get exactly the same type of execution, use the following command line:
      
      	~mooshak/bin/safeexec --exec $COMMAND_LINE
      
      
    In fact, Mooshak invokes safeexec with more command-line options to enforce limits on system resources (memory, time, etc.), but with this command line you may find, for instance, that your program requires certain environment variables that are not available to the safeexec user.

    Also, if you copied data files for the test cases via the clipboard (cut-and-paste) and they were produced on Windows, be aware that Mooshak runs on Linux, which uses a single end-of-line character (LF) instead of the CR+LF sequence used by Windows.
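
A small Python helper (an illustration, not part of Mooshak) that normalizes such files before installing them as test cases:

```python
def to_unix(data: bytes) -> bytes:
    """Replace Windows CR+LF line endings with the single LF used on Linux."""
    return data.replace(b"\r\n", b"\n")


# Typical use: rewrite a test-case file in place.
# with open("test1.in", "rb") as f:
#     fixed = to_unix(f.read())
# with open("test1.in", "wb") as f:
#     f.write(fixed)
```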

Registration FAQ

  • I want to prevent people from registering in a contest
  • After version 1.3, email registration is switched off by default. As admin, you can set the Register field in the contest folder to control email registration.

    In versions before 1.3, you can disable registration by removing the register command from the guest profile. To manage the guest profile, open the config/profiles/guest folder in the admin interface, then select the register item in the Authorized selector.

  • Registration seems to be working but emails are not being sent
  • To be able to send email, the server must have sendmail installed and configured. In the mail_log file in the contests/your_contest/groups folder you should find a pair of lines for each registration, the first written when Mooshak starts sending the message and the last when it terminates.

      22.11.2004 18:31:38: sending to zp@ncc.up.pt
      22.11.2004 18:31:38: OK sent to zp@ncc.up.pt.

    If an error occurs during this phase, the second line will contain the error message generated by sendmail.