Hi_Panda_CRL

241 posts
Rank: 1w+

Caffe GNU error

*** Aborted at 1491880114 (unix time) try "date -d @1491880114" if you are using GNU date *** PC: @ 0x7fefd5f82cde (unknown) *** SIGSEGV (@0x0) received by PID 2769 (TID 0x7fefea08ca40) from PID 0;

2017-04-11 11:11:35

Install OpenCV on Linux

The following steps have been tested for Ubuntu 10.04 but should work with other distros as well. Required Packages: GCC 4.4.x or later, CMake 2.8.7 or higher, Git, GTK+ 2.x or higher, including headers

2017-04-09 09:21:17

Add sudo authority


2017-04-09 09:10:22

Caffe Layer Library

Convolution layer # convolution layer { name: "loss1/conv" type: "Convolution" bottom: "loss1/ave_pool" top: "loss1/conv" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2

2017-03-14 15:22:33

Caffe Log Visualization

1. Record your train/test log as a log file: TOOLS=./build/tools GLOG_logtostderr=0 GLOG_log_dir=deepid/deepid2/Log/ \ $TOOLS/caffe train \ --solver=deepid/deepid2/deepid_solver.prototxt 2. Parse

2016-09-28 14:28:41
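The parsing step this post goes on to describe is normally done with Caffe's own log-parsing tool; purely as a rough sketch of what such parsing involves, here is a minimal regex-based version (the log-line format and the `parse_loss` helper are my assumptions, not the tool's actual code):

```python
import re

# Typical Caffe training-log lines look roughly like:
#   I0928 14:28:41.123 solver.cpp:228] Iteration 100, loss = 1.234
# (format assumed for illustration; the real tool is tools/extra/parse_log.py)
LOSS_RE = re.compile(r"Iteration (\d+), loss = ([0-9.]+)")

def parse_loss(log_text):
    """Return a list of (iteration, loss) pairs found in a log dump."""
    return [(int(it), float(loss)) for it, loss in LOSS_RE.findall(log_text)]

sample = """I0928 14:28:41.123 solver.cpp:228] Iteration 100, loss = 1.234
I0928 14:29:02.456 solver.cpp:228] Iteration 200, loss = 0.987"""
print(parse_loss(sample))  # [(100, 1.234), (200, 0.987)]
```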

Caffe --- SyncedMemory

The SyncedMemory class is defined in syncedmem.hpp/cpp and is responsible for Caffe's low-level memory management. Note: switching Caffe's underlying data between CPU mode and GPU mode relies on this memory-synchronization module. Personally, I think an analysis of SyncedMemory is essential if you want to study Blob. Memory allocation and deallocation are handled by two inline functions (which do not belong to the SyncedMemory class). The code is simple and direct: in CPU mode, it calls m

2016-07-25 09:59:49
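SyncedMemory itself is C++, but the head-state bookkeeping the post describes (uninitialized / head-at-CPU / head-at-GPU / synced, with lazy allocation and copy-on-first-access) can be illustrated with a toy Python sketch; all names here are mine and the "GPU" is just a second list, not Caffe's real code:

```python
# Toy model of SyncedMemory's lazy CPU/GPU synchronization.
UNINITIALIZED, HEAD_AT_CPU, HEAD_AT_GPU, SYNCED = range(4)

class ToySyncedMemory:
    def __init__(self, size):
        self.head = UNINITIALIZED
        self.size = size
        self._cpu = self._gpu = None

    def _to_cpu(self):
        if self.head == UNINITIALIZED:
            self._cpu = [0.0] * self.size        # lazy host allocation
            self.head = HEAD_AT_CPU
        elif self.head == HEAD_AT_GPU:
            self._cpu = list(self._gpu)          # pretend device->host copy
            self.head = SYNCED

    def _to_gpu(self):
        if self.head == UNINITIALIZED:
            self._gpu = [0.0] * self.size        # lazy "device" allocation
            self.head = HEAD_AT_GPU
        elif self.head == HEAD_AT_CPU:
            self._gpu = list(self._cpu)          # pretend host->device copy
            self.head = SYNCED

    def mutable_cpu_data(self):
        self._to_cpu()
        self.head = HEAD_AT_CPU                  # CPU copy is now authoritative
        return self._cpu

    def gpu_data(self):
        self._to_gpu()                           # sync happens only on demand
        return self._gpu

m = ToySyncedMemory(3)
m.mutable_cpu_data()[0] = 7.0    # write on the CPU side
print(m.gpu_data()[0])           # 7.0
```

Nothing is copied until the "other side" is actually read, which is the point of the design.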

Caffe --- blob code

Two excellent articles: http://blog.csdn.net/xizero00/article/details/50886829 and http://www.cnblogs.com/yymn/articles/5341347.html

2016-07-23 13:02:21

Caffe Solver

Solver scaffolds the optimization bookkeeping and creates the training network for learning and test network(s) for evaluation. It iteratively optimizes by calling forward/backward and updating p

2016-07-21 15:46:39
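The forward/backward/update loop the excerpt describes can be sketched in miniature; this is a generic plain-SGD loop on a toy quadratic of my own choosing, not the Solver's actual code:

```python
# Minimal forward/backward/update loop in the spirit of a Caffe solver.
# Objective: f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
def forward(w):
    return (w - 3.0) ** 2          # compute the loss

def backward(w):
    return 2.0 * (w - 3.0)         # compute the gradient

w, lr = 0.0, 0.1                   # initial weight and learning rate
for step in range(100):            # the solver's iteration loop
    loss = forward(w)
    grad = backward(w)
    w -= lr * grad                 # parameter update (plain SGD)

print(round(w, 4))  # 3.0, the minimizer
```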

Blobs, Layers, and Nets: anatomy of a Caffe model

Blobs: As one of Caffe's four major modules, Blob handles CPU/GPU memory allocation, synchronization, and data-persistence mapping. All internal data storage and communication in Caffe goes through Blob, which provides a unified storage interface used to hold training data, model parameters, and so on. Blob in fact delegates to the SyncedMemory class, which encapsulates CPU/GPU memory allocation, synchronization, and deallocation; SyncedMemory thus performs the actual memory operations.

2016-07-21 15:39:26
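As a toy illustration of the Blob interface sketched above (a data array paired with a diff array, with storage delegated to a SyncedMemory-like holder), here is a minimal Python stand-in; the class name and the plain-list storage are mine, though `update()` mirrors the data = data - diff rule of Caffe's `Blob::Update`:

```python
# Toy Blob: a data array plus a same-shaped diff (gradient) array.
# Real Blobs delegate storage to SyncedMemory; a plain list stands in here.
class ToyBlob:
    def __init__(self, count):
        self.data = [0.0] * count
        self.diff = [0.0] * count

    def update(self):
        # Caffe's Blob::Update computes data = data - diff.
        self.data = [d - g for d, g in zip(self.data, self.diff)]

b = ToyBlob(3)
b.data = [1.0, 2.0, 3.0]
b.diff = [0.1, 0.1, 0.1]
b.update()
print(b.data)
```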

Python learning

Packages are a way of structuring Python's module namespace by using "dotted module names"

2016-07-21 11:57:15
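Dotted module names can be shown end to end by building a throwaway package on disk and importing it; the package name `soundpkg` and its layout are made up for this demo:

```python
# Build a temporary package to show "dotted module names" in action.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
pkg = os.path.join(root, "soundpkg", "effects")
os.makedirs(pkg)
# __init__.py files mark the directories as packages
open(os.path.join(root, "soundpkg", "__init__.py"), "w").close()
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "echo.py"), "w") as f:
    f.write("def twice(x):\n    return [x, x]\n")

sys.path.insert(0, root)
from soundpkg.effects import echo   # dotted module name
print(echo.twice("hi"))             # ['hi', 'hi']
```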

Python layer

Using the code released with the paper Fully Convolutional Networks for Semantic Segmentation as an example, this post explains how to write a Python layer. https://github.com/shelhamer/fcn.berkeleyvision.org First you have to build Caffe with the WITH_PYTHON_LAYER option 1. R

2016-07-20 11:04:07
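A real Python layer subclasses caffe.Layer and implements setup/reshape/forward/backward, which Caffe then calls; since Caffe is not importable here, this standalone skeleton (a Euclidean-loss-like example of my own, with lists in place of blobs) only shows the four-method control flow:

```python
# Skeleton of a Caffe-style Python layer. In real use this class subclasses
# caffe.Layer; here it is standalone so the flow can be run without Caffe.
class EuclideanLossLike:
    def setup(self, bottom, top):
        # called once; validate inputs, parse layer params
        assert len(bottom) == 2, "need two bottoms to compare"

    def reshape(self, bottom, top):
        # called before every forward; size intermediates and outputs
        self.diff = [a - b for a, b in zip(bottom[0], bottom[1])]
        top[0] = [0.0]

    def forward(self, bottom, top):
        # loss = sum of squared differences / (2N)
        top[0][0] = sum(d * d for d in self.diff) / (2.0 * len(self.diff))

    def backward(self, top, propagate_down, bottom):
        # gradient of the loss w.r.t. the first bottom
        self.grad = [d / len(self.diff) for d in self.diff]

layer = EuclideanLossLike()
bottom, top = [[1.0, 2.0], [0.0, 0.0]], [None]
layer.setup(bottom, top)
layer.reshape(bottom, top)
layer.forward(bottom, top)
print(top[0][0])   # 1.25 = (1 + 4) / (2 * 2)
```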

caffe net visualization

net.blobs.items() stores the feature-map data of each layer in the network for the predicted image. net.params.items() stores the learned network parameters after training. The vis_square function visualizes data: it mainly normalizes the data and converts it into a square grid that plt can display. plt.imshow(net.deprocess('data', net.blobs['data'].data[4]))

2016-07-19 18:20:02
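A rough re-creation of what a vis_square-style helper does (normalize to [0, 1], pad, and tile an (n, h, w) stack of feature maps into a near-square grid); this version and its name are mine, not the notebook's code:

```python
import numpy as np

def vis_square_grid(data):
    """Normalize an (n, h, w) stack and tile it into a near-square grid."""
    data = (data - data.min()) / max(data.max() - data.min(), 1e-12)  # -> [0, 1]
    n = int(np.ceil(np.sqrt(data.shape[0])))             # grid side length
    pad = ((0, n ** 2 - data.shape[0]), (0, 1), (0, 1))  # fill count + 1px borders
    data = np.pad(data, pad, mode="constant", constant_values=1)
    h, w = data.shape[1], data.shape[2]
    # interleave grid rows/cols with tile rows/cols, then flatten to 2-D
    data = data.reshape(n, n, h, w).transpose(0, 2, 1, 3).reshape(n * h, n * w)
    return data

grid = vis_square_grid(np.random.rand(10, 5, 5))
print(grid.shape)   # (24, 24): a 4x4 grid of 6x6 padded tiles
```

The returned array can be passed straight to plt.imshow.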

caffe net implement

#include <algorithm> #include <map> #include <set> #include <string> #include <utility> #include <vector> #include "caffe/common.hpp" #include "caffe/layer.hpp" #include "caffe/net.hpp" #include "caffe/

2016-07-19 17:28:31

caffe interface --- python

#include <Python.h> // NOLINT(build/include_alpha) // Produce deprecation warnings (needs to come before arrayobject.h inclusion). #define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION #include <boost/make_

2016-07-19 17:17:04

caffe interface --- matlab

This post first introduces some basics, then analyzes a large engineering application, caffe_, from an engineering perspective: how to design a good large-scale interaction interface. Meanwhile, locating matlab's performance bottleneck is exactly what we need

2016-07-19 15:19:16

Zheng Shuai's five-year PhD retrospective

The most important thing in these five years was gradually learning how to undertake something big. That sounds grand, but it really comes down to one point: quieting the mind. First, only with a quiet mind can you dig into a field and do serious work. There is too much social media now, and too much news of every kind; if you spend every day responding to these broad but shallow signals, or busily passing judgment on others, you cannot accomplish anything. It is like travel: mechanically ticking off the famous sights gives you no new insight, at most some small talk. To truly experience the beauty of nature, you must set foot where others cannot reach,

2016-07-08 15:58:26

factor graph, potential function, template models

A factor expresses the fitness of a particular combination of variables. In a BN, the factor is the conditional probability distribution (CPD); but a factor does not always correspond to a probability (and need not lie in 0~1), for example in an MRF. Analogous to operations on database tables, the basic operations on factors are factor product and factor marginalization

2016-07-05 17:23:26
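The two table-like operations named in the excerpt can be written out directly; a minimal numpy sketch over binary variables, where the factor values are made up fitness scores rather than probabilities:

```python
import numpy as np

# phi1(A, B) and phi2(B, C): factors over binary variables, stored as arrays
# indexed by variable assignments.
phi1 = np.array([[3.0, 1.0],
                 [2.0, 4.0]])        # phi1[a, b]
phi2 = np.array([[5.0, 1.0],
                 [1.0, 2.0]])        # phi2[b, c]

# Factor product: psi(A, B, C) = phi1(A, B) * phi2(B, C),
# multiplying entries that agree on the shared variable B.
psi = phi1[:, :, None] * phi2[None, :, :]   # shape (2, 2, 2)

# Factor marginalization: sum B out of psi to get tau(A, C).
tau = psi.sum(axis=1)

print(tau)
```

Summing out the shared variable after a product is exactly one step of variable elimination, which is why these two operations are the basic ones.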

dense_CRF

/* Copyright (c) 2013, Philipp Krähenbühl. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the followi

2016-06-30 10:44:32

deeplab script---python

import os, sys, subprocess sys.path.insert(0, os.getcwd() + '/python/my_script/') from tester import tester from trainer import trainer from crf_runner import crf_runner, grid_search import tools # MO

2016-06-01 07:53:21

deepLab

1. matio can't find HDF5 libraries. Change file /densecrf/makefile as: g++ refine_pascal_v4/dense_inference.cpp util/Timer.h libDenseCRF.a $(CC) refine_pascal_v4/dense_inference.cpp -o prog_refine

2016-05-30 07:20:29