CMS Pixel Detector Layer 1 Replacement Elog
Entry  Mon May 11 21:37:43 2020, Dinko Ferencek, Software, Fixed the BB defects plots in the production overview page 
0407e04c: attempting to fix the BB defects plots in the production overview page (seems mostly related to the 17 to 10 C change)
f2d554c5: it appears that BB2 defect maps were not processed correctly
Entry  Mon May 11 14:40:16 2020, danek kotlinski, Other, M1582 m1582_roc1_thr_1d.pngm1582_roc1_thr_2d.pngm1582_roc1_ph70.png
On Friday I tested module M1582 at room temperature in the blue box.
The report in MoreWeb says that this module has problems with trimming 190 pixels in ROC1.

I see no problem in ROC1. The average threshold is 50 with rms=1.37. Only 1 pixel is in the 0 bin.
See the attached 1d and 2d plots.

Also the PH looks good. The vcal 70 PH map is reconstructed at vcal 70.3 with rms of 3.9.
5159 pixels have valid gain calibrations.

I conclude that this module is fine.
Maybe it is again a DTB problem, as reported by Andrey.
D.
Entry  Mon May 11 14:14:05 2020, danek kotlinski, Other, M1606 m1606_roc2_thr_1d.pngm1606_roc2_thr_2d.pngm1606_roc2_ph70.png
On Friday I tested M1606 at room temperature in the red cold box.
Previously it was reported that trimming does not work for ROC2.

In this test trimming was fine, only 11 pixels failed it.
See the attached 1D and 2D histograms. There is a small side peak at about vcal=56 with ~100 pixels.
But this should not be too big a problem?

Also the Pulse height map looks good and the reconstructed pulse height at vcal=70
gives vcal=68.1 with rms=4.2, see the attached plot.

So I conclude that this module is fine.
Entry  Mon May 11 13:19:51 2020, Andrey Starodumov, Cold box tests, M1539 
After several attempts, including reconnecting the cable, M1539 had no readout when connected to TB3. When connected to TB1, M1539 did not show any problem. M1606 worked properly with both TB1 and TB3.
For the FT test the configuration is the following:
TB1: M1539
TB3: M1606
Entry  Thu May 7 01:51:03 2020, Dinko Ferencek, Software, Strange bug/feature affecting Pixel Defects info in the Production Overview page Production_Overview_Pixel_defects_problem.png
It was observed that sometimes the Pixel Defects info in the Production Overview page is missing (see the attached screenshot).

It turned out this was happening for those modules for which the MoReWeb analysis was run more than once. The solution is to remove all info from the database for the affected modules:

python Controller.py -d

type in the module name (e.g. M1668); when prompted, type 'all', press ENTER, and confirm that you want to delete all entries. After that, run

python Controller.py -m M1668

followed by

python Controller.py -p

The missing info should now be visible.
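The sequence above can be sketched as a small helper that builds the three command lines for a given module (the Controller.py flags are taken from the steps above; the helper itself is hypothetical):

```python
# Hypothetical helper: builds the three MoReWeb command lines described above
# for one module. It only constructs the commands; note that the -d step is
# interactive (you still type the module name and 'all' when prompted).
def reprocess_commands(module):
    return [
        ["python", "Controller.py", "-d"],          # delete DB entries for the module
        ["python", "Controller.py", "-m", module],  # re-run the MoReWeb analysis
        ["python", "Controller.py", "-p"],          # regenerate the production overview
    ]
```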
Entry  Thu May 7 00:56:50 2020, Dinko Ferencek, Software, MoReWeb updates related to the BB2 test 9x
Andrey noticed that results of the BB2 test (here an example for ROC 12 in M1675) were not properly propagated to the ROC Summary.

This was fixed in d9a1258a. However, looking at the summary for ROC 5 in the same module after the fix, it became apparent that dead pixels were double-counted under the dead bumps despite the fact that they were supposed to be subtracted here. From the following debugging printout
Chip 5 Pixel Defects Grade A
        total:    5
        dead:     2
        inef:     0
        mask:     0
        addr:     0
        bump:     2
        trim:     1
        tbit:     0
        nois:     0
        gain:     0
        par1:     0
        total: set([(5, 4, 69), (5, 3, 68), (5, 37, 30), (5, 38, 31), (5, 4, 6)])
        dead:  set([(5, 37, 30), (5, 3, 68)])
        inef:  set([])
        mask:  set([])
        addr:  set([])
        bump:  set([(5, 4, 69), (5, 38, 31)])
        trim:  set([(5, 4, 6)])
        tbit:  set([])
        nois:  set([])
        gain:  set([])
        par1:  set([])

it became apparent that the column and row addresses for pixels with bump defects were shifted by one. This was fixed in 415eae00.

However, there was still a problem with the pixel defects info in the production overview page, which was still using the BB test results.

After switching to the BB2 test results in ac9e8844, the pixel defects info looked better, but it was still not in complete sync with the info presented in the FullQualification Summary.

This is due to double-counting of dead pixels which still needs to be fixed for the Production Overview.
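The intended subtraction can be illustrated with the defect sets from the printout above (pixels as (roc, col, row) tuples; a minimal sketch, not the actual MoReWeb code):

```python
# Defect sets copied from the debugging printout for chip 5
dead = {(5, 37, 30), (5, 3, 68)}
bump = {(5, 4, 69), (5, 38, 31)}
trim = {(5, 4, 6)}

# A pixel already counted as dead must not be counted again as a dead bump
bump_only = bump - dead

total = dead | bump_only | trim
assert len(total) == 5  # matches "total: 5" in the printout
```

With the corrected bump addresses the dead and bump sets are disjoint, so the subtraction changes nothing here; before the off-by-one fix the shifted bump addresses could collide with dead pixels and inflate the count.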
Entry  Thu May 7 00:27:41 2020, Dinko Ferencek, Module grading, Comment about TrimBitDifference and its impact on the Trim Bit Test 
To expand on the following elog, on Mar. 24 Andrey changed the TrimBitDifference parameter in Analyse/Configuration/GradingParameters.cfg from 2 to -2:
$ diff Analyse/Configuration/GradingParameters.cfg.default Analyse/Configuration/GradingParameters.cfg
45c45
< TrimBitDifference = 2.
---
> TrimBitDifference = -2.

From the way this parameter is used here, one can see that setting TrimBitDifference to any negative value effectively turns off the test.

More details about problems with the Trim Bit Test can be found in this elog.
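As an illustration of why a negative value disables the test, here is a sketch assuming the grading check has the form "measured difference < TrimBitDifference" (the real MoReWeb logic may differ; the function name is hypothetical):

```python
TRIM_BIT_DIFFERENCE = -2.0  # value set in GradingParameters.cfg on Mar. 24

def pixel_fails_trim_bit_test(measured_diff, cut=TRIM_BIT_DIFFERENCE):
    # Assumed form of the check: a pixel fails when its measured trim-bit
    # threshold difference falls below the cut. Measured differences are
    # non-negative, so with any negative cut no pixel can ever fail.
    return measured_diff < cut
```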
Entry  Thu Apr 30 16:47:00 2020, Matej Roguljic, Software, MoReWeb empty DAC plots 
Some of the DAC parameter plots were empty in the total production overview page. All the empty plots had the number "35" in them (e.g. DAC distribution m20_1 vana 35). The problem was tracked down to the trimming configuration: MoReWeb was expecting us to trim to Vcal 35, while we decided to trim to Vcal 50. I "grepped" for where this was hardcoded and changed 35 -> 50.

The places where I made changes:
  • Analyse/AbstractClasses/TestResultEnvironment.py
    'trimThr':35
  • Analyse/Configuration/GradingParameters.cfg.default
    trimThr = 35
  • Analyse/OverviewClasses/CMSPixel/ProductionOverview/ProductionOverviewPage/ProductionOverviewPage.py
    TrimThresholds = ['', '35']
  • Analyse/OverviewClasses/CMSPixel/ProductionOverview/ProductionOverviewPage/ProductionOverviewPage.py
    self.SubPages.append({"InitialAttributes" : {"Anchor": "DACDSpread35", "Title": "DAC parameter spread per module - 35"}, "Key": "Section","Module": "Section"})


It's interesting to note that someone had already made the change in "Analyse/Configuration/GradingParameters.cfg"
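The "grep" step can also be done from Python; a hypothetical helper (the pattern and glob are assumptions) that scans a source tree for a hardcoded threshold value:

```python
import re
from pathlib import Path

def find_hardcoded(root, pattern=r"\b35\b", glob="*.py"):
    """Return (path, line number, line) for every match of pattern in files
    under root. A sketch of the manual grep, not part of MoReWeb itself."""
    hits = []
    for path in sorted(Path(root).rglob(glob)):
        for lineno, line in enumerate(path.read_text().splitlines(), 1):
            if re.search(pattern, line):
                hits.append((str(path), lineno, line.strip()))
    return hits
```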
    Reply  Thu Apr 30 17:24:57 2020, Dinko Ferencek, Software, MoReWeb empty DAC plots 

Matej Roguljic wrote:
Some of the DAC parameters plots were empty in the production overview page. All the empty plots had the number "35" in them (e.g. DAC distribution m20_1 vana 35). The problem was tracked down to the trimming configuration. Moreweb was expecting us to trim to Vcal 35, while we decided to trim to Vcal 50. I "grepped" where this was hardcoded and changed 35->50.

The places where I made changes:
  • Analyse/AbstractClasses/TestResultEnvironment.py
    'trimThr':35
  • Analyse/Configuration/GradingParameters.cfg.default
    trimThr = 35
  • Analyse/OverviewClasses/CMSPixel/ProductionOverview/ProductionOverviewPage/ProductionOverviewPage.py
    TrimThresholds = ['', '35']
  • Analyse/OverviewClasses/CMSPixel/ProductionOverview/ProductionOverviewPage/ProductionOverviewPage.py
    self.SubPages.append({"InitialAttributes" : {"Anchor": "DACDSpread35", "Title": "DAC parameter spread per module - 35"}, "Key": "Section","Module": "Section"})


It's interesting to note that someone had already made the change in "Analyse/Configuration/GradingParameters.cfg"


As far as I can remember, the changes in Analyse/AbstractClasses/TestResultEnvironment.py, Analyse/Configuration/GradingParameters.cfg.default and Analyse/Configuration/GradingParameters.cfg were there from before, probably made by Andrey. It is possible that you looked at the files while I was preparing logically separate commits affecting the same files, which required temporarily undoing and later reapplying some of the changes in order to separate the commits. The commits are now on GitLab https://gitlab.cern.ch/CMS-IRB/MoReWeb/-/commits/L1replacement, specifically:

435ffb98: grading parameters related to the trimming threshold updated from 35 to 50 VCal units
1987ff18: updates in the production overview page related to a change in the trimming threshold
    Reply  Thu Apr 30 17:33:04 2020, Andrey Starodumov, Software, MoReWeb empty DAC plots 

Matej Roguljic wrote:
Some of the DAC parameters plots were empty in the total production overview page. All the empty plots had the number "35" in them (e.g. DAC distribution m20_1 vana 35). The problem was tracked down to the trimming configuration. Moreweb was expecting us to trim to Vcal 35, while we decided to trim to Vcal 50. I "grepped" where this was hardcoded and changed 35->50.

The places where I made changes:
  • Analyse/AbstractClasses/TestResultEnvironment.py
    'trimThr':35
  • Analyse/Configuration/GradingParameters.cfg.default
    trimThr = 35
  • Analyse/OverviewClasses/CMSPixel/ProductionOverview/ProductionOverviewPage/ProductionOverviewPage.py
    TrimThresholds = ['', '35']
  • Analyse/OverviewClasses/CMSPixel/ProductionOverview/ProductionOverviewPage/ProductionOverviewPage.py
    self.SubPages.append({"InitialAttributes" : {"Anchor": "DACDSpread35", "Title": "DAC parameter spread per module - 35"}, "Key": "Section","Module": "Section"})


It's interesting to note that someone had already made the change in "Analyse/Configuration/GradingParameters.cfg"

I have changed:
1) StandardVcal2ElectronConversionFactor from 50 to 44, since the VCal calibration of PROC600V4 is 44 e/VCal.
2) TrimBitDifference from 2 to -2, so as not to take into account the failed trim bit test, which is an artifact of the trim bit test SW.
       Reply  Thu May 7 00:10:15 2020, Dinko Ferencek, Software, MoReWeb empty DAC plots 

Andrey Starodumov wrote:

Matej Roguljic wrote:
Some of the DAC parameters plots were empty in the total production overview page. All the empty plots had the number "35" in them (e.g. DAC distribution m20_1 vana 35). The problem was tracked down to the trimming configuration. Moreweb was expecting us to trim to Vcal 35, while we decided to trim to Vcal 50. I "grepped" where this was hardcoded and changed 35->50.

The places where I made changes:
  • Analyse/AbstractClasses/TestResultEnvironment.py
    'trimThr':35
  • Analyse/Configuration/GradingParameters.cfg.default
    trimThr = 35
  • Analyse/OverviewClasses/CMSPixel/ProductionOverview/ProductionOverviewPage/ProductionOverviewPage.py
    TrimThresholds = ['', '35']
  • Analyse/OverviewClasses/CMSPixel/ProductionOverview/ProductionOverviewPage/ProductionOverviewPage.py
    self.SubPages.append({"InitialAttributes" : {"Anchor": "DACDSpread35", "Title": "DAC parameter spread per module - 35"}, "Key": "Section","Module": "Section"})


It's interesting to note that someone had already made the change in "Analyse/Configuration/GradingParameters.cfg"

I have changed:
1) StandardVcal2ElectronConversionFactor from 50 to 44, since the VCal calibration of PROC600V4 is 44 e/VCal.
2) TrimBitDifference from 2 to -2, so as not to take into account the failed trim bit test, which is an artifact of the trim bit test SW.


1) is committed in 74b1038e.
2) was made on Mar. 24 (for more details, see this elog) and is currently left in Analyse/Configuration/GradingParameters.cfg; it might be committed in the future depending on what is decided about the usage of the Trim Bit Test in module grading:
$ diff Analyse/Configuration/GradingParameters.cfg.default Analyse/Configuration/GradingParameters.cfg
45c45
< TrimBitDifference = 2.
---
> TrimBitDifference = -2.

There were a few other code updates related to a change of the warm test temperature from 17 to 10 C. Those were committed in 3a98fef8.
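For reference, the conversion that the changed parameter controls can be sketched as follows (the constant is the PROC600V4 value quoted above; the function name is hypothetical):

```python
STANDARD_VCAL_TO_ELECTRONS = 44.0  # e-/VCal for PROC600V4, was 50.0

def vcal_to_electrons(vcal):
    # Convert a VCal DAC value to electrons using the calibration factor
    return vcal * STANDARD_VCAL_TO_ELECTRONS
```

With this factor, the 200e noise grading cut used in the entries below corresponds to roughly 4.5 VCal units.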
Entry  Wed May 6 16:24:21 2020, Andrey Starodumov, Full test, FT of M1580, M1595, M1606, M1659 
M1580: Grade B due to mean noise >200e in ROC5/8 and trimming failures for 100+ pixels in the same ROCs at +10C; the previous result of April 27 was better
M1595: Grade B due to mean noise >200e in a few ROCs; the previous result of April 30 was much worse, with 80/90 pixels failing trimming in ROC0 and ROC15
M1606: Grade C due to 192 pixels failing trimming in ROC2 at +10C; the previous result of April 6 was much better, with a B grade
M1659: Grade B due to mean noise >200e in a few ROCs; the previous result of April 7 was almost the same

M1606 to tray C* for further investigation
Entry  Wed May 6 13:20:28 2020, Andrey Starodumov, Full test, FT of M1574, M1581, M1660, M1668 
Modules tested on May 5th
M1574: Grade B due to mean noise >200e in ROC10 and trimming failures for 89 pixels in ROC0, the same as the first time on April 24 (then 104 pixels failed)
M1581: Grade B due to mean noise >200e in ROC8/13, but no more trimming failures in ROC8/13 as there were on April 27 (120+ pixels failed in ROC8/13) -> Results improved!
M1660: Grade B due to mean noise >200e in a few ROCs, and no more trimming failure for 172 pixels in ROC7 as there was on April 7 -> Results improved!
M1668: Grade B due to mean noise >200e in a few ROCs; results are worse than they were on April 14: one more ROC with mean noise >200e

Summary: for 2 modules the results improved, for the other 2 they are almost the same
Entry  Tue May 5 13:58:45 2020, Andrey Starodumov, Full test, FT of M1582, M1649, M1667 
M1582: Grade C due to trimming failure in ROC1 for 189 pixels at +10C. This is the third time the module was retested:
1) February 26 (trimming for VCal 40 and old PH optimization): Grade B, max 29 failed pixels and mean noise in a few ROCs
2) April 27: Grade C due to trimming failure in ROC1 for 167 pixels at +10C; at -20C still max 45 failed pixels and mean noise in a few ROCs
3) May 5: Grade C due to trimming failure in ROC1 for 189 pixels at +10C; at -20C trimming failure in ROC1 for 157 pixels
The module quality is getting worse.

M1649: Grade B due to mean noise >200e in ROC11
M1667: Grade B due to mean noise >200e in few ROCs

M1582 is in C* tray. To be investigated.
Entry  Mon May 4 15:28:14 2020, Andrey Starodumov, General, M1660  
M1660 is taken from gel-pak and cabled for retest.
This module was graded C only at the second FT at -20C; the first FT at -20C and the FT at +10C gave grade B. Massive trimming failure of pixels in ROC7 was not observed.
The module will be retested.
Entry  Mon May 4 14:18:20 2020, Andrey Starodumov, Full test, FT of M1540, 1549, 1571, 1598 
M1540: Grade A
M1549: Grade B due to mean noise >200e for ROC2 and 48 dead pixels in ROC5
M1571: Grade B due to mean noise >200e for many ROCs
M1598: Grade B due to mean noise >200e for a few ROCs
Entry  Mon May 4 14:13:39 2020, Andrey Starodumov, Full test, FT of M1552, M1553, M1595, M1597 
FT on April 30th
M1552: Grade B due to mean noise >200e for ROC7,8
M1553: Grade B due to mean noise >200e for a few ROCs
M1595: Grade B due to mean noise >200e for a few ROCs at -20C, and the same plus trimming failures for ROC0 (82 pixels) and ROC15 (94 pixels)
M1597: Grade B due to mean noise >200e for a few ROCs
Entry  Fri May 1 19:34:01 2020, danek kotlinski, Module grading, M1582 m1582_roc1_thr.pngm1582_roc1_thr_2d.png
M1582 was classified as C because of 167 pixels failing trimming in ROC1.
I have tested this module.
The attached plots show the 1d & 2d threshold distributions.
The average threshold is 49.98 with rms=1.39; there is 1 pixel failing (at 0) and 1 pixel with a very low threshold of 37.
I think this ROC is OK, actually it is very nice.
D.
Entry  Thu Apr 30 15:38:43 2020, danek kotlinski, Module transfer, M1635 & M1671 transferred to gel-pack 
Two bad modules have been placed in gel-packs: 1635 & 1671.
Entry  Thu Apr 30 15:25:36 2020, Andrey Starodumov, Full test, FT of M1548, M1549, M1550, M1551 
M1548: Grade B due to mean noise >200e for ROC11
M1549: Grade B due to mean noise >200e for ROC2. In total 200+ pixels failed trimming in the module -> investigate???
M1550: Grade B due to mean noise >200e for ROC5
M1551: Grade B due to mean noise >200e for a few ROCs

M1549 in tray C* for investigation
Entry  Thu Apr 30 15:16:58 2020, Andrey Starodumov, Full test, FT of M1540, M1541, M1543, M1547 
Modules tested on April 29
M1540: Grade B due to many (>1000) pixels failing trimming, but only 70 are in the "C-zone" for ROC0 at -20C -> retest!!!
M1541: Grade B due to mean noise >200e for a few ROCs
M1543: Grade B due to mean noise >200e for ROC8 and 30+ damaged bumps in ROC14
M1547: Grade A

M1540 in C* tray for retest
Entry  Wed Apr 29 18:11:36 2020, Andrey Starodumov, Full test, FT of M1590, M1592, M1596, M1600 
Modules tested on April 28
M1590: Grade B due to mean noise >200e for a few ROCs
M1592: Grade B due to mean noise >200e for a few ROCs
M1596: Grade B due to mean noise >200e for a few ROCs
M1600: Grade B due to mean noise >200e for a few ROCs
Entry  Wed Apr 29 14:08:42 2020, Andrey Starodumov, Full test, FT of M1536, M1537, M1538 
M1536: Grade B due to mean noise >200e for ROC1
M1537: Grade B due to mean noise >200e for a few ROCs
M1538: Grade B due to mean noise >200e for a few ROCs and trimming failure for 70 pixels in ROC14 at -20C