In CATV land, levels are expressed in dBmV, decibels relative to 1 millivolt, as opposed to the dBm (decibels relative to 1 milliwatt) that all the other industries use.
I mistakenly used dBm out of habit. Given my radio background, I never got used to expressing levels in dBmV during my time in CATV.
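If you want to sanity-check the two units against each other, here's a quick conversion sketch assuming the standard 75-ohm CATV impedance; the handy anchor point is that 0 dBmV into 75 ohms works out to roughly -48.75 dBm:

    import math

    Z_CATV = 75.0  # ohms, standard CATV system impedance

    def dbmv_to_dbm(dbmv, z=Z_CATV):
        """Convert dBmV (relative to 1 mV RMS) to dBm (relative to 1 mW)."""
        v_rms = 1e-3 * 10 ** (dbmv / 20)   # volts
        p_mw = (v_rms ** 2 / z) * 1e3      # milliwatts
        return 10 * math.log10(p_mw)

    def dbm_to_dbmv(dbm, z=Z_CATV):
        """Convert dBm back to dBmV."""
        p_w = 1e-3 * 10 ** (dbm / 10)      # watts
        v_rms = math.sqrt(p_w * z)         # volts
        return 20 * math.log10(v_rms / 1e-3)

    print(dbmv_to_dbm(0))   # ~ -48.75 dBm: 0 dBmV across 75 ohms
    print(dbm_to_dbmv(0))   # ~ +48.75 dBmV: 1 mW into 75 ohms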
I'm super curious what other changes the tech made, because a change in upstream levels like this can't be caused by a "filter" by itself. Perhaps the filter was causing high downstream loss, the connections had been rearranged to bring a lot of extra system gain to the device to compensate for that loss, and that rearrangement also left the upstream path with unusually little loss? I know as a tech I was always incentivised to oversimplify my explanations to the customer of what I had done, so as to avoid boggling their mind and cut down on time spent yakking.
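To make that guess concrete: the passive parts of the drop are essentially reciprocal, so taking loss out of the path (say, moving the drop to a lower-value tap port or removing a splitter leg) raises the downstream level at the modem and lowers the upstream transmit level the modem needs, at the same time. A tiny sketch with made-up numbers, purely for illustration:

    TARGET_RX_AT_CMTS_DBMV = 0.0   # illustrative upstream target at the CMTS

    def required_tx(upstream_path_loss_db):
        """Upstream transmit level needed to hit the target receive level."""
        return TARGET_RX_AT_CMTS_DBMV + upstream_path_loss_db

    # Hypothetical before/after: a rewire that removes ~9 dB of passive loss
    # helps BOTH directions, since the passives are reciprocal.
    before_loss, after_loss = 47.0, 38.0
    print("tx before:", required_tx(before_loss), "dBmV")
    print("tx after: ", required_tx(after_loss), "dBmV")
    # Downstream at the modem comes up by the same ~9 dB, which may be what
    # the tech was after when compensating for the filter's loss.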
Low transmit levels are generally pretty rare, because designers are motivated to provide only the minimum return-path gain needed for successful connections across the industry-standard range of valid trunk-tap configurations, which keeps noise - and therefore outages - to a minimum. The only times I saw really LOW transmit levels were in odd situations, such as at the very far end of very old outside plant: downstream levels were heavily attenuated by old cable rated for only around 400MHz, return losses were very low since the low upstream frequencies demand only modest performance from any coax, and tap values were allowed to be as low as 4 or 6dB, which you won't find in any modern, properly designed and maintained system. Now the minimum tap value is about 10. An example that comes to mind was a place where most of the existing outside plant was still original direct-burial 400MHz cable (RG-11), with a mix of old and new taps that were only touched when a repair came up and the original old parts were obviously no longer available. I would get calls for low-TX modems providing terrible service, or for terrible CATV digital reception on channels carried at high frequencies (600+MHz), or both. I had to use an expensive and delicate mix of upstream attenuation devices and counter-slope amplification on-premises to get my recall rate for that town down to an acceptable level. That town also happened to have a lot of exceptional install scenarios: homes on enormous properties with long drop lines, and exceptionally low development density (acreages for rich people, business elite), where the usual formulas for outside plant design went right out the window. TL;DR I speak from experience.
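To put a rough number on why the upstream got off easy while the 600+MHz channels suffered: coax attenuation is dominated by skin effect and scales roughly with the square root of frequency, so a run that's punishing at downstream frequencies is comparatively mild in the 5-42MHz return band. A back-of-the-envelope sketch; the 0.7 dB per 100 ft at 50 MHz baseline is a made-up illustrative figure, not a spec for RG-11 or any particular cable:

    import math

    # Hypothetical baseline loss for an aging run -- illustrative only.
    LOSS_DB_PER_100FT_AT_50MHZ = 0.7

    def loss_db(freq_mhz, length_ft, base=LOSS_DB_PER_100FT_AT_50MHZ):
        """Rough coax loss estimate: skin-effect loss scales ~sqrt(frequency)."""
        return base * math.sqrt(freq_mhz / 50.0) * (length_ft / 100.0)

    run_ft = 1500  # a long rural drop/plant run
    for f in (30, 42, 450, 650):
        print(f"{f:>4} MHz: ~{loss_db(f, run_ft):.1f} dB over {run_ft} ft")
    # The high downstream channels see several times the loss of the return
    # band, which is why counter-slope amplification helped downstream while
    # the return path needed padding instead.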
Modems cannot just "increase output" until their signal gets sufficient SNR; that would hammer the receiver at the CMTS with signals exceeding its dynamic range. The ranging process that modems must go through lets each device train its transit time through the system so that its TDMA bursts arrive on time and don't interfere with its neighbors, and it also establishes the initial transmit level needed to reach the specified receive level (power level). AGC cannot be used at the CMTS because TDMA is used, so all bursts must arrive at roughly equal power, and the protocol enforces this. After ranging is complete, the CMTS provides periodic feedback to the modems for the fine adjustments needed to maintain the channel(s).
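A toy sketch of that closed-loop idea (not the actual DOCSIS ranging message flow, just the gist): the CMTS measures each burst against its target receive level and tells the modem how much to step its transmit power, bounded per adjustment and capped by the modem's transmit range, rather than the modem blindly cranking itself up. All the level numbers below are illustrative:

    TARGET_RX_DBMV = 0.0    # CMTS target receive level (illustrative)
    MAX_TX_DBMV = 54.0      # modem transmit ceiling (illustrative)
    MIN_TX_DBMV = 17.0      # modem transmit floor (illustrative)

    def ranging_adjust(tx_dbmv, path_loss_db, step_limit_db=3.0):
        """One ranging iteration: CMTS measures the burst, sends a bounded
        power correction, modem applies it within its transmit limits."""
        rx_dbmv = tx_dbmv - path_loss_db
        error = TARGET_RX_DBMV - rx_dbmv
        correction = max(-step_limit_db, min(step_limit_db, error))
        return max(MIN_TX_DBMV, min(MAX_TX_DBMV, tx_dbmv + correction))

    tx = 30.0          # initial transmit level
    path_loss = 45.0   # upstream loss between modem and CMTS, in dB
    for i in range(6):
        tx = ranging_adjust(tx, path_loss)
        print(f"iteration {i}: tx = {tx:.1f} dBmV, rx = {tx - path_loss:.1f} dBmV")
    # Converges toward the target receive level; a modem that can't reach it
    # tops out at its ceiling instead of "increasing output" without bound.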