steps:
- task: UseDotNet@2
  displayName: 'Install .NET 6.0.x SDK'
  inputs:
    version: 6.0.x
    performMultiLevelLookup: true
Monday, 31 January 2022
Updating from .NET Core 3.1 to .NET 6.0
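Alongside installing the new SDK in the pipeline (the UseDotNet@2 task above), each project file needs its target framework bumped. A minimal sketch of what the updated .csproj would look like (property values beyond TargetFramework are omitted):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- Was netcoreapp3.1; .NET 6 uses the shortened TFM -->
    <TargetFramework>net6.0</TargetFramework>
  </PropertyGroup>
</Project>
```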
Saturday, 22 January 2022
Running ZEUS Analysis Code
Although I built CERNLib a long time ago, apart from running PAW and looking at some old ntuples I hadn't got round to building my old ZEUS analysis code.
Setup the Environment
Code Changes
.EQ. to .EQV.
Using 1 and 0 for True and False
DFLIB
Makefile
.inc Files
c -------------------------------------------------
c Local definitions for F2/ISR analysis
c -------------------------------------------------
c Kinematical variables
c -------------------------------------------------
      real empzcal, empztot, y_el, y_sig, y_elcorr, corrected_en,
     & best_th, Q2_el, x_el, bestx, besty, bestz, electron_en,
     & electron_th, logyel, logxel, logq2elc, logysig, logq2el,
     & logmcq2, logmcx, logmcy, elumie, elumig, q2corr,
     & zempzhad, zy_jb, ccy_jb, zy_jbcorr,
     & logyjbz, best_th2, gamma, gamma2, z, logmcy2
How to run the Analysis code
- Steering cards. This file contains the list of cuts used to select events. Different cards define different cuts, which lets me analyse systematic errors
- File listing input Data .rz ntuples
- File listing input Background .rz ntuples
- File listing input MonteCarlo .rz ntuples
- stc97_n for the steering cards
- fort.45 for the Data file list
- fort.46 for the Background file list
- fort.47 for the MonteCarlo file list
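Since Fortran opens unit N as a file named fort.N in the working directory by default, setting up a run is just a matter of putting the inputs under the expected names. A sketch of that setup, with placeholder file contents and an assumed executable name (`zeusana` is my invention, not the real binary):

```shell
# The analysis reads steering cards from stc97_n and its ntuple lists
# from Fortran units 45-47, i.e. files fort.45..fort.47 in the cwd.
# All file contents below are placeholders for illustration.
printf 'data96.rz\ndata97.rz\n' > fort.45   # Data ntuple list
printf 'bg96.rz\n'              > fort.46   # Background ntuple list
printf 'mc_dis96.rz\n'          > fort.47   # MonteCarlo ntuple list
printf 'Q2MIN 4.0\n'            > stc97_n   # steering cards (cuts)
# ./zeusana                                 # assumed analysis executable
```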
I should make clear at this point that the files I'm running against are not the "raw" ZEUS data files. A preliminary job was run against the full set of data on tape to select a cut-down set of events passing some loose cuts and write an ntuple with the fields needed for this particular analysis. My ntuples were saved after this first step.
Results
- Run the PAW macro (kumac) files I used after this analysis step and add them to GitHub.
- Run against the full data set. I want to see how quickly this runs on current hardware. To run the full 96/97 data set against all steering cards would take about 20 hours!
- Create a GitHub release that includes sample .rz files. Hopefully this will allow anyone to run it!
- Try to build the 64-bit version of CERNLIB so I don't have to run on a 32-bit VM.
Wednesday, 12 January 2022
Adding Azure Pipeline status to GitHub README.md
I noticed that some GitHub projects display a build status badge in their README.md. As I have Azure Pipeline builds set up for two of my projects, CSVComparer and SuperMarketPlanner, I thought it would be a good idea to add this.
Handily, there is a REST API interface available for the Azure Pipeline build!
The link for the badge is of the form:
https://dev.azure.com/{Organisation}/{Project}/_apis/build/status/{pipelinename}
For CSVComparer that's:
https://dev.azure.com/jonathanscott80/CSVComparer/_apis/build/status/jscott7.CSVComparer
Then I can add a hyperlink to the badge to navigate to the latest build page when clicked. This is of the form:
https://dev.azure.com/{Organisation}/{Project}/_build/latest?definitionId={id}
Again, for CSVComparer that's:
https://dev.azure.com/jonathanscott80/CSVComparer/_build/latest?definitionId=2
The definitionId is the ID of the Pipeline assigned by Azure. You can find this by navigating through to the Pipeline page via https://dev.azure.com/{Organisation}/
(Useful link for the official docs here : Azure Pipelines documentation | Microsoft Docs)
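Putting the two URL templates together, the README snippet can be composed from just four values: organisation, project, pipeline name, and definitionId. A small shell sketch (the variable names are mine, the values are from the examples above):

```shell
# Compose the badge markdown from the pipeline's coordinates.
ORG="jonathanscott80"            # {Organisation}
PROJECT="CSVComparer"            # {Project}
PIPELINE="jscott7.CSVComparer"   # {pipelinename}
DEF_ID=2                         # definitionId assigned by Azure

BADGE="https://dev.azure.com/${ORG}/${PROJECT}/_apis/build/status/${PIPELINE}"
LINK="https://dev.azure.com/${ORG}/${PROJECT}/_build/latest?definitionId=${DEF_ID}"
MARKDOWN="[![Build Status](${BADGE})](${LINK})"
echo "${MARKDOWN}"
```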
Finally, to put it all together add this to the README.md file:
[![Build Status](https://dev.azure.com/jonathanscott80/CSVComparer/_apis/build/status/jscott7.CSVComparer)](https://dev.azure.com/jonathanscott80/CSVComparer/_build/latest?definitionId=2)
And clicking on the badge takes us here: