I think there are a couple of forum threads about this situation, as well as posts on the author's blog. You can google it with the keywords "backtrader multiple datafeeds synchronization".
My understanding is that, unless something special is done, the last available value of a datafeed with scattered dates is repeatedly delivered to cerebro as the clock progresses. One solution is to keep a counter of the length of each datafeed in your next() or prenext() method. When a feed has no new data, its length will not increase, even though the stale value is delivered again. Something like this:

```python
def __init__(self):
    # datafeed pointer, keeping track of whether each datafeed
    # has delivered new data or not
    self.df_pt = dict()
    for i, d in enumerate(self.datas):
        self.df_pt[d._name] = 0

def next(self):
    # check if new data has been delivered for the current datafeed
    for i, d in enumerate(self.datas):
        dt, dn = self.datetime.date(), d._name
        if len(d) <= self.df_pt[dn]:
            pass  # stale bar: no new data for this feed; take any action you desire
        else:
            self.df_pt[dn] = len(d)
```
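To see why comparing len(d) against a stored counter detects stale bars, here is a standalone sketch in plain Python with no backtrader dependency. The Feed class, the integer "dates", and the fresh_log list are hypothetical stand-ins for real datafeeds and the cerebro clock; only the counter logic mirrors the snippet above.

```python
# Minimal stand-in for a backtrader datafeed: len(feed) counts only the
# bars actually delivered, while a feed with a gap simply does not grow.
class Feed:
    def __init__(self, name, bar_dates):
        self.name = name           # mirrors backtrader's d._name
        self.bar_dates = bar_dates # dates on which this feed has real data
        self.delivered = 0

    def advance(self, date):
        if date in self.bar_dates:  # a new bar arrives -> length grows
            self.delivered += 1

    def __len__(self):
        return self.delivered


dates = [1, 2, 3, 4]                       # the master clock
feeds = [Feed("A", {1, 2, 3, 4}),          # dense feed, a bar every date
         Feed("B", {1, 3})]                # sparse feed with gaps

df_pt = {f.name: 0 for f in feeds}         # last seen length per feed
fresh_log = []

for date in dates:                         # cerebro's clock loop
    for f in feeds:
        f.advance(date)
    for f in feeds:                        # same check as in next() above
        if len(f) <= df_pt[f.name]:
            fresh_log.append((date, f.name, "stale"))
        else:
            df_pt[f.name] = len(f)
            fresh_log.append((date, f.name, "fresh"))

for entry in fresh_log:
    print(entry)
# Feed B is flagged "stale" on dates 2 and 4, where it has no bar.
```

The key point is that the stale value delivered again on a gap date does not change len(f), so the comparison against the stored counter cleanly separates repeated bars from genuinely new ones.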