"The Bayesian Lasso" by Trevor Park and George Casella interprets the Lasso estimate for the linear model as a Bayesian posterior mode under independent Laplace (double-exponential) priors on the regression parameters. Because the Laplace prior can be written as a scale mixture of normals (SMN) with independent exponential priors on the mixing variances, the same posterior mode arises from conjugate normal priors on the regression parameters together with independent exponential hyperpriors, and the full posterior can be explored with a Gibbs sampler. The paper formulates the complete hierarchical model, gives an overview of how the Gibbs sampler is implemented, and summarizes ways to select the Lasso parameter, offering both Bayesian and likelihood-based approaches: empirical Bayes via marginal maximum likelihood, and placing an appropriate hyperprior on the parameter. The methods described also carry over to related Lasso-type estimators such as bridge regression, the Huberized Lasso, and other robust variants. The study briefly reviews the ordinary Lasso, ridge regression, and the Bayesian Lasso in terms of their model formulations and the insights each model yields, and it highlights the practical implementation of the Gibbs sampler for the Bayesian Lasso alongside the theory behind choosing the lambda values. In this literature review project, we set out to implement the Bayesian Lasso and compare it with ordinary least squares, ridge regression, and the ordinary Lasso. Following the diabetes example given in the paper, we applied these models to a real-world dataset to gain a proper understanding of how they work, and then compared the resulting estimates through graphs and plots.
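The hierarchy summarized above (conditionally normal coefficients, an inverse-gamma-style update for the error variance, and exponential hyperpriors on the mixing variances) can be sketched as a small Gibbs sampler. This is an illustrative sketch only, not the paper's reference implementation: the function name `bayesian_lasso_gibbs`, the fixed `lam` value, and the numerical floor on `beta**2` are our own choices, and lambda is held fixed rather than chosen by marginal maximum likelihood or a hyperprior.

```python
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, seed=0):
    """Sketch of a Bayesian Lasso Gibbs sampler for the linear model
    y = X beta + eps, using the scale-mixture-of-normals hierarchy:
    beta_j | tau_j^2, sigma^2 ~ N(0, sigma^2 tau_j^2), tau_j^2 ~ Exp(lam^2/2)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    beta = np.zeros(p)
    sigma2 = 1.0
    inv_tau2 = np.ones(p)          # 1 / tau_j^2
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma^2 A^{-1}), A = X'X + D_tau^{-1}
        A_inv = np.linalg.inv(XtX + np.diag(inv_tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # sigma^2 | rest: inverse-gamma, drawn as scale / Gamma(shape)
        resid = y - X @ beta
        shape = (n - 1) / 2 + p / 2
        scale = resid @ resid / 2 + beta @ (inv_tau2 * beta) / 2
        sigma2 = scale / rng.gamma(shape)
        # 1/tau_j^2 | rest: inverse-Gaussian (numpy's Wald distribution)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        inv_tau2 = rng.wald(mu, lam**2)
        draws[t] = beta
    return draws
```

In practice one would discard an initial burn-in portion of `draws` and summarize the remainder by posterior medians or means, which is how the coefficient estimates in comparisons like the diabetes example are typically obtained.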