Author: Im, Jiyoung
Date available: 2018-12-21
Date issued: 2018-12-15
URI: http://hdl.handle.net/10012/14282
Abstract: In this thesis, we study the special case of linear optimization to show what may affect the sensitivity of the optimal value function under data uncertainty. In this special case, we show that a robust optimization problem with a locally smaller feasible region yields a more conservative robust optimal value than one with a locally bigger feasible region. To achieve that goal, we use a geometric approach to analyze the sensitivity of the optimal value function for linear programming (LP) under data uncertainty. We construct a family of proper cones in which strict containment holds for any pair of cones in the family. We then form a family of LP problems using this family of cones: the feasible regions of each pair of LPs in the family satisfy strict containment, every LP in the family has a unique optimal solution at the vertex of its cone, and every LP has the same objective function; that is, every LP in the family shares the same optimal solution and the same optimal value. We rewrite the LPs so that they reflect the given data uncertainty and perform local analysis near the optimal solutions, where the local strict containment holds. Finally, we illustrate that an LP with a locally smaller feasible region is more sensitive than an LP with a locally bigger feasible region.
Language: en
Keywords: sensitivity analysis; linear programming; robust optimization; polyhedral theory; optimization under data uncertainty
Title: Sensitivity Analysis and Robust Optimization: A Geometric Approach for the Special Case of Linear Optimization
Type: Master Thesis
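The construction described in the abstract — a strictly nested family of proper cones whose associated LPs all share one optimal solution and optimal value — can be sketched with a simple illustrative example. The specific cones and objective below are an assumption chosen for illustration, not taken from the thesis itself:

```latex
% An illustrative family of proper cones in R^2, indexed by t > 0:
%   K_t = { (x_1, x_2) : x_2 >= t |x_1| }.
% For 0 < t_1 < t_2, strict containment holds: K_{t_2} \subsetneq K_{t_1}.
%
% The associated family of LPs (same objective for every t):
%   (P_t):  minimize  x_2   subject to  (x_1, x_2) \in K_t.
%
% Each (P_t) attains its unique optimal solution at the vertex of the
% cone, x^* = (0, 0), with optimal value 0 -- so every LP in the family
% shares the same optimal solution and the same optimal value, while
% the feasible regions are strictly nested.
\[
  K_t = \{ (x_1, x_2) \in \mathbb{R}^2 : x_2 \ge t\,|x_1| \}, \qquad
  K_{t_2} \subsetneq K_{t_1} \ \text{ for } 0 < t_1 < t_2,
\]
\[
  (P_t): \quad \min_{x \in K_t} \; x_2, \qquad
  x^* = (0,0), \quad \text{optimal value } 0 \ \text{ for all } t > 0.
\]
```

Under a perturbation of the data near $x^*$, the LP over the narrower cone (larger $t$) has less feasible room locally, which is the sense in which the abstract says the locally smaller feasible region is more sensitive.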