In a recent LinkedIn post, Rad AI draws attention to trade-offs in pixel-size design for semiconductor detectors used in medical CT and non-destructive testing (NDT). The post describes simulation work spanning 100,000 photon events across seven pixel sizes in cadmium telluride (CdTe) detectors, highlighting how charge sharing can degrade spectral accuracy at very small pixel dimensions.
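To give a feel for the physics behind that claim, the toy Monte Carlo below models each photon's deposited charge as a two-dimensional Gaussian cloud and tallies how much of it stays inside the struck pixel at different pixel pitches. This is a minimal sketch under stated assumptions, not Rad AI's simulation: the 30 μm cloud width, the pitch values, and the 10% sharing threshold are invented for illustration, and only the 100,000-event count echoes the post.

```python
# Toy Monte Carlo of charge sharing vs. pixel pitch in a CdTe-like detector.
# All physical parameters below are illustrative assumptions, not values
# taken from Rad AI's post.
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(0)

N_EVENTS = 100_000       # event count cited in the post
CLOUD_SIGMA_UM = 30.0    # assumed Gaussian charge-cloud sigma, in micrometers
PITCHES_UM = [100, 150, 200, 250, 300, 400, 500]  # seven illustrative pitches

def primary_pixel_fraction(pitch_um: float) -> np.ndarray:
    """Fraction of each event's charge cloud collected by the pixel that was hit."""
    # Uniform impact positions inside one pixel spanning [0, pitch] in x and y.
    x = rng.uniform(0.0, pitch_um, N_EVENTS)
    y = rng.uniform(0.0, pitch_um, N_EVENTS)
    s = np.sqrt(2.0) * CLOUD_SIGMA_UM
    # 1-D collected fraction = Gaussian mass lying between the two pixel edges.
    fx = 0.5 * (erf((pitch_um - x) / s) + erf(x / s))
    fy = 0.5 * (erf((pitch_um - y) / s) + erf(y / s))
    return fx * fy

for pitch in PITCHES_UM:
    frac = primary_pixel_fraction(pitch)
    shared = np.mean(frac < 0.9)  # events losing >10% of their charge to neighbors
    print(f"pitch {pitch:3d} um: mean collected {frac.mean():.3f}, "
          f"shared events {shared:.1%}")
```

Under these assumptions, the fraction of charge-shared events climbs steeply as the pitch shrinks toward the cloud size, which is the qualitative effect the post attributes to very small pixels.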
The post suggests that, at the high photon flux typical of clinical CT, larger pixels of roughly 250–300 micrometers mitigate excessive charge sharing while preserving spectral features such as tungsten characteristic peaks. By contrast, industrial NDT runs at flux levels 100–1,000 times lower, which eases count-rate constraints and lets designers choose smaller pixels for higher spatial resolution.
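The count-rate side of that trade-off is straightforward arithmetic: the rate seen by one pixel is the incident flux multiplied by the pixel area, so larger pixels absorb proportionally more events per second and suffer more pulse pileup. The sketch below quantifies this with a standard paralyzable dead-time model; the flux figures and the 20 ns dead time are assumed, order-of-magnitude values, not numbers from the post.

```python
# Back-of-envelope pileup check: per-pixel rate = flux * pixel area.
# Flux values and dead time are assumed for illustration only.
import math

DEAD_TIME_S = 20e-9  # assumed per-pixel dead time (20 ns)

def pileup_loss(flux_per_mm2_s: float, pitch_um: float) -> tuple[float, float]:
    """Return (true per-pixel count rate, fraction of counts lost to pileup)
    under a paralyzable model: recorded = true * exp(-true * dead_time)."""
    area_mm2 = (pitch_um * 1e-3) ** 2
    true_rate = flux_per_mm2_s * area_mm2
    recorded = true_rate * math.exp(-true_rate * DEAD_TIME_S)
    return true_rate, 1.0 - recorded / true_rate

for label, flux in [("clinical CT", 1e8), ("industrial NDT", 1e5)]:
    for pitch_um in (100, 300):
        rate, loss = pileup_loss(flux, pitch_um)
        print(f"{label:14s} pitch {pitch_um} um: {rate:.2e} cps/pixel, "
              f"pileup loss {loss:.1%}")
```

With these assumed numbers, a 300 μm pixel at CT-like flux loses roughly 16% of its counts to pileup while the same pixel at NDT-like flux loses essentially none, consistent with the post's point that lower NDT flux relaxes count-rate constraints.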
The post presents the optimal detector pixel size as application-specific, depending on flux, the spectral information required, spatial resolution, and count-rate requirements, with simulation positioned as a key design tool. Rad AI is portrayed as offering spectral CT simulation and detector optimization services, suggesting a focus on becoming an enabling-technology provider for medical imaging and inspection OEMs.
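To make "application-specific" concrete, the hypothetical sketch below scores candidate pitches with a weighted figure of merit that trades a toy charge-sharing penalty and pileup load against a resolution bonus. Every model term and weight here is invented for illustration and should not be read as Rad AI's optimization method.

```python
# Hypothetical figure of merit for pixel-pitch selection. All penalties,
# gains, and weights are made-up illustrations, not Rad AI's models.
def pixel_score(pitch_um: float, flux_per_mm2_s: float,
                w_spectral: float, w_spatial: float) -> float:
    charge_share_penalty = min(1.0, 60.0 / pitch_um)   # toy: worse at small pitch
    pileup_penalty = min(1.0, flux_per_mm2_s * (pitch_um * 1e-3) ** 2 * 20e-9)
    resolution_gain = 100.0 / pitch_um                 # toy: better at small pitch
    return (w_spectral * (1.0 - charge_share_penalty)
            + w_spatial * resolution_gain
            - pileup_penalty)

pitches = (100, 150, 200, 250, 300, 400, 500)
best_ct = max(pitches, key=lambda p: pixel_score(p, 1e8, w_spectral=1.0, w_spatial=0.1))
best_ndt = max(pitches, key=lambda p: pixel_score(p, 1e5, w_spectral=0.3, w_spatial=1.0))
print(f"toy optimum: clinical CT {best_ct} um, industrial NDT {best_ndt} um")
```

With these invented weights, the toy optimum lands at 250 μm for the CT-like profile and 100 μm for the NDT-like one, mirroring the qualitative split the post describes.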
For investors, this emphasis on detailed detector-physics modeling could signal a strategy to embed Rad AI’s tools or services into upstream design workflows for photon-counting CT and NDT systems. If the company can convert this technical expertise and validated simulation results into commercial design contracts or software licensing, it could benefit from long product lifecycles and rising industry interest in photon-counting imaging.

