Bayesian Extraction of Jet Energy Loss Distributions in Heavy-Ion Collisions

Summary: study notes.

1. The problem

The task is to fit the three parameters [alpha, beta, gamma] of a statistical model for the medium-induced jet energy loss, using Bayesian analysis with MCMC sampling. The priors of the three parameters are assumed to be uniform distributions; the pT-loss distribution itself is modeled as a gamma distribution: alpha sets its shape, while beta and gamma control its mean pT loss (in the code below these three parameters appear as a, b and c, with mean loss b*pT^c*ln(pT)).
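Written out explicitly (this is reconstructed from the code below, with the code's parameter names a, b, c standing in for alpha, beta, gamma), the pT-loss distribution and the predicted nuclear modification factor are

W(x \mid p_T) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\beta x},
\qquad \alpha = a, \quad \beta = \frac{a}{\langle \Delta p_T \rangle}, \quad
\langle \Delta p_T \rangle = b\, p_T^{\,c} \ln p_T ,

R_{AA}(p_T) \approx \frac{\int_0^{\infty} W(x \mid p_T + x)\, f_{pp}(p_T + x)\, \mathrm{d}x}{f_{pp}(p_T)} ,

where f_pp(p_T) is the interpolated proton-proton spectrum.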


2. Workflow


1. Data preparation

2. Model construction

3. Fit the model

4. Extract the fitted parameters

The detailed procedure should be clear from the code below.

import numpy as np
import matplotlib.pyplot as plt
import pymc3 as pm
from pymc3 import Uniform, Normal
import theano.tensor as tt            # gamma function that works on pymc3/theano tensors
from scipy.interpolate import interp1d

def read_data():   # read data: column 0 = pT, column 1 = spectrum
    dat = np.loadtxt("./RAA_2760.txt")
    return dat[:, 0], dat[:, 1]

pp_x, pp_y = read_data()
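# 30-point Gauss-Laguerre quadrature nodes and modified weights (explained and checked below)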
gala30x = np.array([
0.047407180540804851462, 0.249923916753160223994,
0.614833454392768284613, 1.143195825666100798284,
1.836454554622572291486, 2.69652187455721519578,
3.72581450777950894932, 4.92729376584988240966,
6.304515590965074522826, 7.861693293370260468756,
9.603775985479262079849, 11.5365465979561397008,
13.6667446930642362949, 16.00222118898106625462,
18.55213484014315012403, 21.32720432178312892793,
24.34003576453269340098, 27.60555479678096102758,
31.14158670111123581829, 34.96965200824906954359,
39.11608494906788912192, 43.61365290848482780666,
48.50398616380420042732, 53.84138540650750561747,
59.69912185923549547712, 66.18061779443848965169,
73.44123859555988223951, 81.73681050672768572223,
91.5564665225368382555, 104.1575244310588945118
])
gala30w = np.array([
0.121677895367261782329, 0.283556882734938525493,
0.446432426678773424704, 0.610532130075812839931,
0.77630347862220587813, 0.944233288641719426021,
1.11484470167521153566, 1.288705432832675646608,
1.466439137624214862, 1.64873949785431911827,
1.83638767787041103674, 2.03027425167718247131,
2.23142718244322451682, 2.44104811309112149776,
2.66056024337508997273, 2.89167264137737623469,
3.13646833382438459, 3.3975275957906408941,
3.67810472956067256012, 3.9823886239009637486,
4.315899244899456113557, 4.686114009126298021,
5.1035026514834184093, 5.58333298168755125349,
6.1490444086574353235, 6.8391305457947582711,
7.7229478770061872416, 8.9437424683710389523,
10.87187293837791147282, 15.026111628122932986
])
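# The two arrays above are the nodes and modified weights of a 30-point
# Gauss-Laguerre quadrature rule (gala30w[i] = w_i * exp(x_i)), so that for a
# function f(x) that decays at infinity
#     int_0^inf f(x) dx  ~=  sum_i gala30w[i] * f(gala30x[i]).
# Substituting x -> scale*x gives the rescaled form used in model() below,
#     int_0^inf f(x) dx  ~=  scale * sum_i gala30w[i] * f(scale * gala30x[i]),
# where the code picks scale = pT / gala30x[-1], presumably so that the sampled
# pT-loss values run from ~0 up to the measured pT, where the steeply falling
# pp spectrum keeps the integrand non-negligible.
# A minimal sanity check (not part of the original analysis code):
def gala30_integrate(f, scale=1.0):
    x = scale * gala30x
    return np.sum(scale * gala30w * f(x))

print(gala30_integrate(lambda x: np.exp(-x)))                     # ~1.0 = int exp(-x) dx
print(gala30_integrate(lambda x: x * np.exp(-x / 2), scale=2.0))  # ~4.0 = int x exp(-x/2) dx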
class RAA2ELoss():
    def __init__(self, RAA_fname, pp_pt=None, pp_spectra=None):
        self.with_ppdata = False
        if not (pp_pt is None or pp_spectra is None):
            # interpolate the pp reference spectrum so it can be evaluated at shifted pT
            self.pp_fit = interp1d(pp_pt, pp_spectra, fill_value="extrapolate")
            self.with_ppdata = True
        # R_AA data file: columns are pT, pT error, R_AA, R_AA error
        RAA_data = np.loadtxt(RAA_fname)
        self.RAA_x = RAA_data[:, 0]
        self.RAA_xerr = RAA_data[:, 1]
        self.RAA_y = RAA_data[:, 2]
        self.RAA_yerr = RAA_data[:, 3]

    def mean_ptloss(self, pt, b, c):
        # mean pT loss: <dpT> = b * pT^c * ln(pT)
        return b * pt**c * np.log(pt)

    def model(self):
        with pm.Model() as basic_model:
            # uniform priors for the three model parameters
            a = Uniform('a', lower=0, upper=10)
            b = Uniform('b', lower=0, upper=10)
            c = Uniform('c', lower=0, upper=1)
            intg_res = np.zeros_like(self.RAA_x)
            # Gauss-Laguerre quadrature over the pT loss x, with the nodes rescaled
            # so that the largest node maps onto the measured pT
            scale_fct = self.RAA_x / gala30x[-1]
            for i, node in enumerate(gala30x):
                x = node * scale_fct                  # pT loss at this quadrature node
                shifted_pt = self.RAA_x + x           # parton pT before the energy loss
                mean_dpt = self.mean_ptloss(shifted_pt, b, c)
                alpha = a                             # shape of the gamma distribution
                beta = a / mean_dpt                   # rate, so that the mean equals mean_dpt
                # gamma-distributed pT-loss probability density
                pdpt = beta**alpha / tt.gamma(alpha) * pm.math.exp(-beta*x) * x**(alpha-1)
                intg_res = intg_res + scale_fct * gala30w[i] * pdpt * self.pp_fit(shifted_pt)
                # (I did not fully understand the two lines above at first;
                #  the rescaled quadrature is illustrated after the gala30w array.)
            # predicted R_AA: folded pp spectrum divided by the original pp spectrum
            mu = intg_res / self.pp_fit(self.RAA_x)
            # Gaussian likelihood with the experimental R_AA errors as the width
            likelihood_y = Normal('likelihood_y', mu=mu, sd=self.RAA_yerr,
                                  observed=self.RAA_y)
            start = pm.find_MAP(method="Powell")      # MAP estimate as the starting point
            step = pm.Metropolis()
            self.trace = pm.sample(1200, start=start, step=step)
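For completeness, a minimal sketch of how the class might be driven; the R_AA file name, the use of (pp_x, pp_y) as the pp reference spectrum, and the printed summary are my own assumptions rather than code from the original repository:

# Hypothetical driver; "RAA_PbPb2760.txt" is a placeholder file name with the
# four-column layout expected by RAA2ELoss (pT, pT_err, RAA, RAA_err).
if __name__ == "__main__":
    eloss = RAA2ELoss("./RAA_PbPb2760.txt", pp_pt=pp_x, pp_spectra=pp_y)
    eloss.model()                               # MAP start + Metropolis sampling
    for par in ('a', 'b', 'c'):
        samples = eloss.trace[par]              # posterior samples of each parameter
        print(par, samples.mean(), samples.std())

The posterior means and widths of a, b, c are the extracted energy-loss parameters; the trace can also be passed to pm.traceplot for diagnostics.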

PS: The code and data are taken from GitHub; the pymc version used there is 1, so running that original code with pymc3 will raise errors.

The workflow flowchart from the GitHub repository (the plotting and data-saving steps are omitted):

[Figure: 1.png]
