The effectiveness of peer reviews in academic publishing depends in part on the balance between brevity and comprehensiveness: reviews that are too verbose or too terse may fail to convey critical insights, reducing their utility. This paper presents a heuristic system for optimizing peer review length using information-density and argumentation metrics. From a curated dataset containing quantitative and qualitative measures such as content relevance, argument strength, readability index, and unique insights per word, we develop a composite score of review quality. The system applies thresholds on normalized length, information density, and adjusted argument strength to classify reviews as poor, moderate, or excellent. Empirical refinement of these thresholds shows that the framework can improve review quality by balancing word count against clarity and argument consistency. Scatter plots and histograms expose the relationships between the composite score and the underlying metrics, yielding actionable guidance on optimal review length. The results indicate that optimizing review length can substantially improve review quality and relevance. This study provides a foundation for integrating heuristic length optimization into academic review platforms, helping reviews achieve the desired balance of brevity, depth, and clarity.
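As a concrete illustration of the scoring-and-classification step described above, the following Python sketch combines normalized metrics into a weighted composite score and applies the three thresholds. The metric weights, threshold values, and function names are placeholder assumptions for exposition, not the calibrated values from the study.

```python
# Minimal sketch of the composite score and threshold classification.
# All weights and cutoffs below are illustrative assumptions, not the
# empirically refined values used in the paper.

def composite_score(relevance, argument_strength, readability, insights_per_word,
                    weights=(0.3, 0.3, 0.2, 0.2)):
    """Weighted combination of normalized review-quality metrics (each in [0, 1])."""
    w_rel, w_arg, w_read, w_ins = weights
    return (w_rel * relevance + w_arg * argument_strength
            + w_read * readability + w_ins * insights_per_word)

def classify_review(norm_length, info_density, adj_argument_strength,
                    length_band=(0.2, 0.8), density_min=0.5, strength_min=0.6):
    """Threshold-based classification into poor / moderate / excellent.

    A review is 'excellent' only if its normalized length falls inside the
    acceptable band AND it clears both the density and argument-strength
    thresholds; clearing one of the two yields 'moderate'; otherwise 'poor'.
    """
    in_band = length_band[0] <= norm_length <= length_band[1]
    if in_band and info_density >= density_min and adj_argument_strength >= strength_min:
        return "excellent"
    if info_density >= density_min or adj_argument_strength >= strength_min:
        return "moderate"
    return "poor"

# Example: a mid-length, information-dense, well-argued review.
score = composite_score(relevance=0.8, argument_strength=0.7,
                        readability=0.6, insights_per_word=0.5)
label = classify_review(norm_length=0.55, info_density=0.7,
                        adj_argument_strength=0.65)
print(f"composite score = {score:.2f}, classification = {label}")
# -> composite score = 0.67, classification = excellent
```

In this sketch the composite score summarizes overall quality while the classification relies on the three gating thresholds, mirroring the separation the abstract draws between the quality score and the poor/moderate/excellent labels.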