Warm attenuation vs. adaptive lighting?
I'm building a house and plan to use Lutron HomeKit dimmers for all the switches; I'm a fan of Lutron's RF reliability. If Lutron offered adaptive lighting with their dimmers I'd go with that, but realistically it won't exist by the time we build this year, so I'm looking for the next best thing.

I recently heard about warm-dim fixtures: the color temperature shifts from warm light at low brightness to cooler light at maximum brightness. Obviously, if color and brightness could be controlled independently that would be perfect (true adaptive lighting), but how bad would warm dim be as an alternative? I imagine an automation that slowly dims whichever lights are on around sunset and does the opposite at sunrise. Ideally I could also set the target brightness for lights that are off, so that when they're turned on they come on at the correct level right away. I'm not sure how to handle manual changes, though 🤔 maybe the automation should just skip any light that's on and has already been adjusted by hand. Has anyone tried this?
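For concreteness, here's a minimal sketch of the idea: with warm-dim fixtures the automation only ever sets brightness, and the color temperature follows from the hardware. The Kelvin endpoints (1800 K fully dimmed, 3000 K at full) and the 45-minute ramp are assumptions for illustration, not Lutron specs.

```python
def warm_dim_cct(brightness: float, min_cct: int = 1800, max_cct: int = 3000) -> int:
    """Approximate CCT a warm-dim fixture produces at a given dim level (0.0-1.0).

    Warm-dim hardware couples color temperature to brightness, so an
    automation never sets color directly. The Kelvin endpoints here are
    assumed, not taken from any fixture datasheet.
    """
    brightness = max(0.0, min(1.0, brightness))
    return round(min_cct + (max_cct - min_cct) * brightness)


def sunset_ramp(minutes_into_sunset: float, ramp_minutes: float = 45.0) -> float:
    """Target brightness during the evening ramp: 100% down to 20% over the ramp."""
    t = max(0.0, min(1.0, minutes_into_sunset / ramp_minutes))
    return 1.0 - 0.8 * t
```

So 30 minutes into sunset, each light that's on would be set to `sunset_ramp(30)` brightness, and a warm-dim bulb would land around `warm_dim_cct(sunset_ramp(30))` Kelvin on its own. The skip-manually-adjusted rule would just mean excluding any light whose current brightness doesn't match the last value the automation wrote.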