The following problem is motivated by one of my research problems. If you can solve this before me, I will include you as a coauthor in my work.
Let
$\Sigma$ be an $n \times n$ correlation matrix whose least eigenvalue is denoted by $\lambda$.
$\Sigma_i'$ be the $(n-1) \times (n-1)$ submatrix of $\Sigma$ obtained by deleting the $i$th row and $i$th column, whose least eigenvalue is denoted by $\lambda_i'$.
Find an upper bound on the gap $$\Delta := \min_{i \in \{1,\dots,n\}}(\lambda_i' - \lambda).$$
Empirical Observations:
Empirically, I observe the following about $\Delta$:

$\Delta \leq$ the mean of the absolute values of all off-diagonal entries of $\Sigma$.

$\Delta \leq \frac{\|C_j\|}{n-1} \leq \frac{\|C_j\|_1}{n-1}$ for all $j \in \{1,\dots,n\}$, where $C_j$ denotes the $j$th column of $\Sigma$ with its diagonal entry removed (i.e., the vector of all off-diagonal entries of column $j$), and $\|\cdot\|$, $\|\cdot\|_1$ are the Euclidean and $\ell_1$ norms.
Note that the bound in point 2 is strictly tighter than the one in point 1.
Furthermore, if the entries of the eigenvector corresponding to the least eigenvalue of $\Sigma$ are all of the same sign (all +ve or all −ve), then the above observations hold even for the signed entries of $\Sigma$, i.e.,
 $\Delta \leq -\alpha$, where $\alpha$ denotes the mean of the signed values of all off-diagonal entries of $\Sigma$.
 $\Delta \leq -\frac{\sum_{i=1, i \neq j}^{n} c_{ij}}{n-1}$ for all $j \in \{1,\dots,n\}$, where $c_{ij}$ denotes the entry in the $i$th row and $j$th column of $\Sigma$.
The negative sign in both bounds suggests that negative correlations promote a stronger $\Delta$, which is strongly evident empirically. Theoretically, however, I have only managed to obtain looser upper bounds so far. Specifically, this is what I have at the moment:

$\Delta \leq \|C_j\| \leq \|C_j\|_1$ for all $j \in \{1,\dots,n\}$, where $C_j$ is the same as above.

There exists $j \in \{1,\dots,n\}$ such that $\Delta \leq \frac{\|C_j\|}{\sqrt{n-1}}$, where $C_j$ is the same as above.
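For anyone who wants to experiment, here is a minimal NumPy sketch of the setup. It is my own illustration, not part of the problem: `random_correlation` and `gap` are hypothetical helper names, I sample correlation matrices as normalized Gaussian Gram matrices, and I read $\|C_j\|$ as the Euclidean norm of the off-diagonal part of column $j$. It asserts only the safe facts ($\Delta \geq 0$ by Cauchy interlacing, and $\Delta \leq \|C_j\|$) and merely counts violations of the empirical observations.

```python
import numpy as np

def random_correlation(n, rng):
    # One way to sample a correlation matrix: the normalized Gram matrix
    # of n Gaussian vectors (positive semidefinite, unit diagonal).
    A = rng.standard_normal((n, 2 * n))
    S = A @ A.T
    d = np.sqrt(np.diag(S))
    return S / np.outer(d, d)

def gap(S):
    # Delta = min_i (lambda_i' - lambda), where lambda is the least
    # eigenvalue of S and lambda_i' that of S with row/column i deleted.
    n = S.shape[0]
    lam = np.linalg.eigvalsh(S)[0]
    lam_sub = [np.linalg.eigvalsh(np.delete(np.delete(S, i, 0), i, 1))[0]
               for i in range(n)]
    return min(lam_sub) - lam

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, trials, tol = 8, 200, 1e-9
    fails_1 = fails_2 = 0
    for _ in range(trials):
        S = random_correlation(n, rng)
        delta = gap(S)
        cols = [np.delete(S[:, j], j) for j in range(n)]  # the vectors C_j
        # Proven facts: Delta >= 0 (interlacing) and Delta <= ||C_j||.
        assert delta >= -tol
        assert all(delta <= np.linalg.norm(c) + tol for c in cols)
        # Empirical observation 1: mean absolute off-diagonal entry.
        fails_1 += delta > np.abs(S[~np.eye(n, dtype=bool)]).mean() + tol
        # Empirical observation 2: ||C_j|| / (n - 1) for every column j.
        fails_2 += any(delta > np.linalg.norm(c) / (n - 1) + tol
                       for c in cols)
    print("violations of observations 1 and 2:", fails_1, fails_2)
```

Swapping in other samplers for `random_correlation` (e.g. ones biased toward negative correlations) is the quickest way to hunt for counterexamples.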
Any improvements to these, or alternatively counterexamples to the empirical observations, are most welcome and will be acknowledged in my research. In particular, if anyone can prove the bound in point 2 just above (the $\sqrt{n-1}$ bound) for all columns $j$, that would be great. Thanks!