
💀 Minishell : How to succeed at failing and lose all your friends!


📖 Quotes :


$> echo "The Linux philosophy is to laugh in the face of danger 8D!"

$> echo "All operating systems suck, but Linux just sucks less."

$> echo "An infinite number of monkeys typing into GNU emacs would never make a good program."


Introduction :


The objective of this project is to create a simple shell and learn a lot about processes and file descriptors. The existence of shells is linked to the very existence of IT. With the Minishell project, we'll be able to travel through time and come back to the problems people faced when Windows didn't exist.

Besides that, it is meant to make you suffer, read a lot of documentation, lose your sanity and learn to live with bugs.

This is the manual to read RELIGIOUSLY before starting : man bash


🌀 Built-in functions to implement :


| Command | Description |
| ------- | ----------- |
| echo | Echo the STRING(s) to standard output. The -n flag suppresses the trailing newline. |
| cd | Change the shell working directory (with a relative or absolute path). |
| pwd | Print the name of the current/working directory. |
| export | Set the export attribute for shell variables. |
| unset | Unset values of shell variables. |
| env | Print the environment. |
| exit | Cause the shell to exit with the specified exit status. |
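
Although the subject only lists the built-ins, a common way to wire them up is a small lookup table that maps a command name to its handler. The sketch below is only an illustration, using hypothetical names (t_builtin, ft_echo, find_builtin, ...) that are not necessarily the ones used in this repository:

```c
#include <string.h>

/* Hypothetical built-in handler signature; the real prototypes may differ. */
typedef int (*t_builtin_fn)(char **argv);

typedef struct s_builtin
{
    const char   *name;
    t_builtin_fn fn;
}   t_builtin;

/* Handlers are assumed to be implemented elsewhere in srcs/. */
int ft_echo(char **argv);
int ft_cd(char **argv);
int ft_pwd(char **argv);
int ft_export(char **argv);
int ft_unset(char **argv);
int ft_env(char **argv);
int ft_exit(char **argv);

static const t_builtin g_builtins[] = {
    {"echo", ft_echo}, {"cd", ft_cd}, {"pwd", ft_pwd},
    {"export", ft_export}, {"unset", ft_unset},
    {"env", ft_env}, {"exit", ft_exit}, {NULL, NULL}
};

/* Returns the handler for `name`, or NULL if it is not a built-in. */
t_builtin_fn find_builtin(const char *name)
{
    int i;

    i = 0;
    while (g_builtins[i].name)
    {
        if (strcmp(g_builtins[i].name, name) == 0)
            return (g_builtins[i].fn);
        i++;
    }
    return (NULL);
}
```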

🚧 Project organization at the root of the repository :


.
├── includes/
├── srcs/
└── Makefile

The Makefile has two versions of the compilation rules, one for macOS and the other for Linux; please use the one that matches your OS.


FIRST STEP : LEXICAL ANALYSIS / LEXER TOKENIZER.


This is the first step of the parsing phase: it transforms the user input into tokens, because my parser is subdivided into two parts, lexical analysis and tokenizing.

So first of all, I defined a lexer class as an enumerator of all possible character types that can be encountered inside an input string --> for further info check /includes/minishell.h
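
To give an idea, such a character-class enumerator might look like the sketch below; the names are assumptions, the real definitions live in /includes/minishell.h and may differ:

```c
/* Sketch of a character-class enumerator for the lexer. */
typedef enum e_char_type
{
    CHAR_WORD,          /* any regular character */
    CHAR_WHITESPACE,    /* ' ' or '\t' */
    CHAR_PIPE,          /* '|' */
    CHAR_AMPERSAND,     /* '&' */
    CHAR_SEMICOLON,     /* ';' */
    CHAR_GREATER,       /* '>' */
    CHAR_LESSER,        /* '<' */
    CHAR_SINGLE_QUOTE,  /* '\'' */
    CHAR_DOUBLE_QUOTE,  /* '"' */
    CHAR_NULL           /* end of input */
}   t_char_type;
```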

-> Inside the shell loop I send the input string to ft_lexer_parser_program(). This function verifies the validity of the input string (it should only contain grammar defined inside the enumerator : lexing) and also skips the whitespace before and after every analysed token.

-> Then I call linked_list_constructor(). This function builds a linked list where every node is considered a token.

-> A token is defined in a class enumerator as well, in order to manage all the tokens that should be handled by the program --> /includes/Minishell.h (a sketch is given after the examples below).

  • If valid ==> parse it and execute it.
  • Else ==> print error on stderr.
$> ls | wc -l ---> valid tokens, it will be executed.

$> |; ---> should print an error "unexpected token"
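
For reference, the token enumerator and the linked list built by linked_list_constructor() could look roughly like this; the sketch uses assumed names, not necessarily those in /includes/minishell.h:

```c
/* Sketch only: token kinds and the token list node. */
typedef enum e_token_type
{
    TOKEN_WORD,         /* command names, arguments, quoted strings */
    TOKEN_PIPE,         /* |  */
    TOKEN_OR,           /* || */
    TOKEN_AND,          /* && */
    TOKEN_SEMICOLON,    /* ;  */
    TOKEN_REDIR_IN,     /* <  */
    TOKEN_REDIR_OUT,    /* >  */
    TOKEN_APPEND,       /* >> */
    TOKEN_HEREDOC       /* << */
}   t_token_type;

typedef struct s_token
{
    t_token_type    type;
    char            *value;     /* the lexeme, e.g. "ls" or ">>" */
    struct s_token  *next;
}   t_token;
```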

So, in order to make this work effectively, I used dispatch tables (arrays of function pointers) to call the appropriate tokenizer for each token, as sketched below.
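
A minimal sketch of that dispatch-table idea, assuming the t_token type sketched above and hypothetical tokenizer names (tokenize_word, tokenize_redirection, tokenize_separator):

```c
/* Each entry pairs a trigger character with the tokenizer that handles it.
** Names and signatures are illustrative assumptions. */
typedef int (*t_tokenizer)(const char *input, int *i, t_token **list);

int tokenize_word(const char *input, int *i, t_token **list);
int tokenize_redirection(const char *input, int *i, t_token **list);
int tokenize_separator(const char *input, int *i, t_token **list);

typedef struct s_dispatch
{
    char        trigger;
    t_tokenizer fn;
}   t_dispatch;

static const t_dispatch g_tokenizers[] = {
    {'>', tokenize_redirection},
    {'<', tokenize_redirection},
    {'|', tokenize_separator},
    {'&', tokenize_separator},
    {';', tokenize_separator},
    {0, tokenize_word}          /* default: anything else starts a word */
};

/* Picks the tokenizer whose trigger matches the current character;
** falls through to the word tokenizer when nothing matches. */
static t_tokenizer pick_tokenizer(char c)
{
    int i;

    i = 0;
    while (g_tokenizers[i].trigger && g_tokenizers[i].trigger != c)
        i++;
    return (g_tokenizers[i].fn);
}
```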

Mandatory tokens :

Word token :

It must respect the lexing rules, so it has to be valid grammar.

I used a dispatch table to define a word : it is basically a combination of any characters, and it can contain single or double quotes as part of it as well.

Redirections token :

'>', '>>', '<', '<<' : each pair is handled in one function. I only defined the greater-than and less-than symbols in the grammar, so when I find the same symbol twice in a row I increment the pointer: for example '>' followed by '>' is saved as a single '>>' token, and the same goes for '<' and '<<'.

Algo : if *i == '>' ===> if *(i + 1) == '>' then the token is '>>', else the token is '>'.
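
Written out in C, that one-character look-ahead could look like the sketch below (the helper name and the way it returns the lexeme are assumptions):

```c
#include <string.h>

/* Look-ahead on a '>' or '<': returns the operator lexeme
** ("<", "<<", ">" or ">>") and advances the index past it. */
static char *read_redirection(const char *input, int *i)
{
    char    c;

    c = input[*i];
    (*i)++;
    if (input[*i] == c)         /* same symbol again: '>>' or '<<' */
    {
        (*i)++;
        return (c == '>' ? strdup(">>") : strdup("<<"));
    }
    return (c == '>' ? strdup(">") : strdup("<"));
}
```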

Separator token :

'|', '||', '&&', ';'

-> Build a linked list based on the lexed tokens.

-> Send it to ast_constructor() to initialize parsing and represent the data in memory.

-> Then I build an AST : that is the next step, parsing.


SECOND STEP : PARSER

In this phase I parse the logical operators first : ';', '||', '&&'.

The logical OR operator '||' has the same logic as redirections in parsing: if I find a first '|', I increment the pointer to check whether the next character is also '|'; if it is, I define the token as '||'.

-> In the AST I give priority to logical operators, so I put them at the root.

-> Then I look for a pipeline in the linked list of tokens, so the next child is a pipe node if I find one in the token list.

-> Then I parse the simple command; the command options and redirections are part of it as well.

Example of a simple cmd : ls -la > file


$> SIMPLE_CMD | SIMPLE_CMD && SIMPLE_CMD



											[ LOGICAL OPERATOR ]

									[PIPE]							[SIMPLE CMD]

						 [SIMPLE CMD]			[SIMPLE CMD]

$> ls -la | wc -l && echo "listed all"



												[ && ]

									[ | ]					[echo "listed all"]

							[ls -la]		[wc -l]

PIPELINE :


$> cmd1 | cmd2 | cmd3 | cmd4 | cmd5

									[ | ]

							[ | ]			[cmd5]

						[ | ]		[cmd4]

					[ | ]		[cmd3]

				[cmd1]		[cmd2]

AST REPRESENTATION :

The AST has two node types :

-> leaf : a simple command, made up of the command, its options/arguments and any type of redirections.

-> internal (non-leaf) node : either a logical operator or a pipe.
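
A possible shape for such a node, again as a sketch with assumed names rather than the repository's actual struct:

```c
/* Leaves hold a simple command (argv plus its redirections);
** internal nodes hold a pipe, a logical operator or ';' and point to
** their left and right subtrees. */
typedef enum e_node_type
{
    NODE_SIMPLE_CMD,    /* leaf: command + options + redirections */
    NODE_PIPE,          /* |  */
    NODE_AND,           /* && */
    NODE_OR,            /* || */
    NODE_SEMICOLON      /* ;  */
}   t_node_type;

typedef struct s_ast
{
    t_node_type     type;
    char            **argv;     /* only used by leaves */
    struct s_ast    *left;
    struct s_ast    *right;
}   t_ast;
```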


EXAMPLE 01 : pipes and redirections and semicolon

$> <<ok > file && cat file | wc -c ; echo "hello"

						[;]

			[ && ]				[echo "hello"]

	[<<ok > file]		[|]

			[cat file]		[wc -c]

EXAMPLE 02 : and operator and simple commands

$> echo "hello" && ls -la > file && cat file

								[&&]

					[&&]					[cat file]
	[echo "hello"]				[ls -la > file]
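
Putting the precedence together (logical operators and ';' at the root, then pipes, then simple commands), a recursive-descent parser built on the token and AST sketches above could be organized as follows; new_node_from_token() and parse_simple_cmd() are hypothetical helpers, not the repository's actual functions:

```c
/* Sketch only: reuses the t_token / t_ast types and token kinds
** sketched earlier; helpers are assumed to exist elsewhere. */
t_ast   *new_node_from_token(t_token *tok);
t_ast   *parse_simple_cmd(t_token **tok);
t_ast   *parse_pipeline(t_token **tok);

/* logical : pipeline (('&&' | '||' | ';') pipeline)*  -> tree root */
t_ast   *parse_logical(t_token **tok)
{
    t_ast   *left;
    t_ast   *node;

    left = parse_pipeline(tok);
    while (*tok && ((*tok)->type == TOKEN_AND
            || (*tok)->type == TOKEN_OR || (*tok)->type == TOKEN_SEMICOLON))
    {
        node = new_node_from_token(*tok);
        *tok = (*tok)->next;
        node->left = left;
        node->right = parse_pipeline(tok);
        left = node;            /* left-associative, as in the trees above */
    }
    return (left);
}

/* pipeline : simple_cmd ('|' simple_cmd)* */
t_ast   *parse_pipeline(t_token **tok)
{
    t_ast   *left;
    t_ast   *node;

    left = parse_simple_cmd(tok);
    while (*tok && (*tok)->type == TOKEN_PIPE)
    {
        node = new_node_from_token(*tok);
        *tok = (*tok)->next;
        node->left = left;
        node->right = parse_simple_cmd(tok);
        left = node;
    }
    return (left);
}
```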

EXPANSIONS : Should be managed before execution :

[To be continued]

