haohulala


《Multiple-Operation Image Anti-Forensics with WGAN-GP Framework 》代码复现

This came out of a competition: while surveying the literature we happened to read the paper 《Multiple-Operation Image Anti-Forensics with WGAN-GP Framework》. In short, for a tampered image, multiple post-processing operations are applied to hide the tampering traces, and a WGAN network is then used to conceal the traces further, so that the tampering truly leaves nothing to find (the deception rate in practice is around 50%). Network structure: the figure given in the paper is fairly clear. Loss functions: there are two, one for the generator and one for the discriminator. First, the generator loss: $L_G = E_{x'}[\alpha L_G^{pixel}$…
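The excerpt cuts off mid-formula, so the paper's exact loss cannot be reproduced here; as a hedged sketch (not the paper's code), the defining WGAN-GP ingredient named in the title, a gradient penalty on samples interpolated between real and fake images, might look like this. The `critic` module, tensor shapes, and `lambda_gp` default are assumptions:

```python
import torch
import torch.nn as nn

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    """WGAN-GP penalty: push the critic's gradient norm toward 1
    on random interpolations between real and fake samples."""
    batch = real.size(0)
    eps = torch.rand(batch, 1, 1, 1)                 # one mixing weight per sample
    x_hat = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(x_hat)
    grads = torch.autograd.grad(scores.sum(), x_hat, create_graph=True)[0]
    norms = grads.reshape(batch, -1).norm(2, dim=1)  # per-sample gradient norm
    return lambda_gp * ((norms - 1) ** 2).mean()
```

During training this term would be added to the critic's loss alongside the usual fake-minus-real score difference.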

2020-09-17 12:21:27

1051 Pop Sequence (25 points)

Given a stack which can keep M numbers at most. Push N numbers in the order of 1, 2, 3, ..., N and pop randomly. You are supposed to tell if a given sequence of numbers is a possible pop sequence of the stack. For example, if M is 5 and N is 7, we can obta...
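The check can be done by simulating the stack: push 1..N in order, and greedily pop whenever the top matches the next number in the candidate sequence. A sketch (the function name is mine):

```python
def is_pop_sequence(m, n, seq):
    """Return True if seq could be produced by pushing 1..n in order
    onto a stack of capacity m, popping at arbitrary moments."""
    stack, next_push = [], 1
    for x in seq:
        # push numbers until x is on top, or we run out of numbers
        while (not stack or stack[-1] != x) and next_push <= n:
            stack.append(next_push)
            next_push += 1
            if len(stack) > m:      # capacity exceeded: impossible
                return False
        if not stack or stack[-1] != x:
            return False            # x is buried below the top
        stack.pop()
    return True
```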

2020-08-29 09:24:36

1023 Have Fun with Numbers (20 points)

Notice that the number 123456789 is a 9-digit number consisting exactly the numbers from 1 to 9, with no duplication. Double it we will obtain 246913578, which happens to be another 9-digit number consisting exactly the numbers from 1 to 9, only in a diffe
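The check reduces to comparing the digit multisets of the number and its double; Python's big integers make this direct. A sketch, taking the (possibly very long) number as a string:

```python
def doubling_is_permutation(num_str):
    """Return (is_permutation, doubled_str): True when doubling the
    number only permutes its digits."""
    doubled = str(int(num_str) * 2)
    return sorted(doubled) == sorted(num_str), doubled
```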

2020-08-14 15:41:12

1004 Counting Leaves (30 points)

A family hierarchy is usually presented by a pedigree tree. Your job is to count those family members who have no child. Input Specification: Each input file contains one test case. Each case starts with a line containing 0 < N < 100, the number of nod.
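Counting childless members per generation is a breadth-first traversal; a sketch using a children map keyed by node id (the dict representation is my choice, not the problem's input format):

```python
def leaves_per_level(children, root="01"):
    """Walk the tree level by level; return a list whose i-th entry
    is the number of nodes on level i that have no children."""
    counts, level = [], [root]
    while level:
        counts.append(sum(1 for node in level if not children.get(node)))
        # next level: all children of the current level's nodes
        level = [c for node in level for c in children.get(node, [])]
    return counts
```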

2020-08-13 13:46:19

SegNet implemented in PyTorch, trained on the CamVid dataset

The program structure is basically similar to https://blog.csdn.net/haohulala/article/details/107660273; if anything is unclear, that article can also be consulted.

import torch
from torch import nn
import torch.nn.functional as f
import torchvision
import torchvision.transforms as tfs
from torch.utils.data import DataLoader
from to…

2020-08-01 13:19:24

A simple implementation of the FCN network in PyTorch

FCN implemented with reference to a Zhihu column: https://zhuanlan.zhihu.com/p/32506912

import torch
from torch import nn
import torch.nn.functional as f
import torchvision
import torchvision.transforms as tfs
from torch.utils.data import DataLoader
from torch.autograd import Variable
import …

2020-07-29 11:48:48

A fix for failed pretrained-model downloads when doing transfer learning with PyTorch

When doing transfer learning with PyTorch we need to download a pretrained model, but the model is usually large, and downloading it inline from the code is likely to be interrupted; once interrupted, everything downloaded so far is wasted. In this article I describe a way to use pretrained models offline. The idea is simply to download the model with a browser (browser downloads are usually more stable, and an interrupted download can be resumed). The download addresses of the various models are given below; entering the corresponding URL into a browser starts the download. 1. Resnet: model_urls = { 'resnet18': 'https:…
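Once the file is downloaded, using it is a matter of building the architecture without pretrained weights and loading the local state dict. A minimal sketch (the helper name is mine; the path is whatever filename the browser saved):

```python
import torch

def load_offline_weights(model, weights_path):
    """Load a locally downloaded state dict into an already-built model,
    instead of letting torchvision fetch the weights online."""
    state = torch.load(weights_path, map_location="cpu")
    model.load_state_dict(state)
    return model
```

For example, `load_offline_weights(torchvision.models.resnet18(), path)` with the path of the browser-downloaded ResNet-18 file.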

2020-07-26 20:25:12

Comparing transposed-convolution upsampling and max-unpooling upsampling in neural networks

Currently, there are two common methods for image upsampling in convolutional neural networks: the transposed convolution used in the FCN network and the max unpooling used in SegNet. Let us first look at upsampling by transposed convolution; for the principle of transposed convolution, see https://blog.csdn.net/lanadeus/article/details/82534425. We know that the input/output size relation of a convolution is $n_{out} = \frac{n_{in} - kernel + 2 \cdot padding}{stride} + 1$ …
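The two size relations can be checked numerically; a transposed convolution inverts the ordinary formula. A sketch in plain integer arithmetic (function names are mine; the ordinary formula floors the division, as convolutions do):

```python
def conv_out(n_in, kernel, padding=0, stride=1):
    """Output size of an ordinary convolution: (n - k + 2p) // s + 1."""
    return (n_in - kernel + 2 * padding) // stride + 1

def deconv_out(n_in, kernel, padding=0, stride=1):
    """Output size of a transposed convolution, the inverse mapping:
    (n - 1) * s - 2p + k."""
    return (n_in - 1) * stride - 2 * padding + kernel
```

With matching hyperparameters, `deconv_out` recovers the input size that `conv_out` consumed, which is exactly why FCN uses it for upsampling.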

2020-07-23 20:51:55

1102 Invert a Binary Tree (25分)

The following is from Max Howell @twitter: Google: 90% of our engineers use the software you wrote (Homebrew), but you can't invert a binary tree on a whiteboard so fuck off. Now it's your turn to prove that YOU CAN invert a binary tree! Input Specifi
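The inversion itself is a recursive swap of every node's children; a sketch on a dict-based tree (the representation, mapping node id to a (left, right) pair, is my choice, not the problem's input format):

```python
def invert(tree, node):
    """Recursively swap the left and right children of every node.
    tree maps node id -> (left, right), with None for a missing child."""
    if node is None:
        return
    left, right = tree[node]
    tree[node] = (right, left)  # swap this node's children
    invert(tree, left)
    invert(tree, right)
```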

2020-07-16 19:16:10

PyTorch CNN recognition of the CIFAR10 dataset

Trying a deeper architecture for CIFAR10 recognition.

import torch
import torchvision
import torchvision.transforms as transforms

BATCH_SIZE = 64
EPOCHES = 50
NUM_WORKERS = 4
LEARNING_RATE = 0.005

# data transforms
transform = transforms.Compose(
    [transforms.ToTensor(),
     transforms.Normali…

2020-07-16 14:55:24

1021 Deepest Root (25分)

A graph which is connected and acyclic can be considered a tree. The height of the tree depends on the selected root. Now you are supposed to find the root that results in a highest tree. Such a root is called the deepest root. Input Specification: Each .
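Since the graph is a tree, one workable (if not optimal) approach is a breadth-first search from every node, keeping the nodes whose height is maximal. A sketch with 1-based node ids (function names are mine):

```python
from collections import defaultdict

def deepest_roots(n, edges):
    """Return the 1-based nodes that maximize tree height when chosen
    as root, assuming the graph is connected and acyclic."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    def height(root):
        # BFS: count how many levels lie below the chosen root
        seen, depth, frontier = {root}, 0, [root]
        while frontier:
            nxt = [v for u in frontier for v in adj[u] if v not in seen]
            seen.update(nxt)
            if nxt:
                depth += 1
            frontier = nxt
        return depth

    heights = {v: height(v) for v in range(1, n + 1)}
    best = max(heights.values())
    return sorted(v for v, h in heights.items() if h == best)
```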

2020-07-15 15:52:32

Convolution animations explained

Convolution arithmetic. This article is an animated demonstration of the convolution arithmetic used in deep learning. The code and images of this tutorial are free to use under the stated license with proper attribution: [1] Vincent Dumoulin, Francesco Visin - A guide to convolution arithmetic for deep learning (BibTeX). Convolution animations: the blue map is the input image and the cyan map is the output image. No padding, no strides; arbitrary padding, no strides; half padding, no strides (Half pa…

2020-07-14 10:43:31

1020 Tree Traversals (25 points)

Suppose that all the keys in a binary tree are distinct positive integers. Given the postorder and inorder traversal sequences, you are supposed to output the level order traversal sequence of the corresponding binary tree. Input Specification: Each inpu
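The classic construction: the last element of postorder is the root, whose position in inorder splits both sequences into left and right subtrees. Processing subtrees through a queue emits roots level by level, so no explicit node objects are needed. A sketch:

```python
from collections import deque

def level_order(postorder, inorder):
    """Rebuild the tree from its postorder + inorder traversals and
    return the level-order traversal."""
    result = []
    queue = deque([(postorder, inorder)])   # one (post, in) pair per subtree
    while queue:
        post, ino = queue.popleft()
        if not post:
            continue
        root = post[-1]                     # postorder ends with the root
        i = ino.index(root)                 # root's position splits inorder
        result.append(root)
        queue.append((post[:i], ino[:i]))           # left subtree
        queue.append((post[i:-1], ino[i + 1:]))     # right subtree
    return result
```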

2020-07-13 16:20:13

MNIST recognition with a fully connected neural network in PyTorch

This example uses a fully connected neural network for MNIST recognition.

import numpy as np
import torch
from torchvision.datasets import mnist
from torch import nn
from torch.autograd import Variable

def data_tf(x):
    x = np.array(x, dtype="float32")/255
    x = (x-0.5)/0.5
    x = x.reshape((-1))
…

2020-07-13 11:42:22

1103 Integer Factorization (30 points)

The K−P factorization of a positive integer N is to write N as the sum of the P-th power of K positive integers. You are supposed to write a program to find the K−P factorization of N for any positive integers N, K and P. Input Specification: Each input ...

2020-07-10 10:14:15

1037 Magic Coupon

The magic shop in Mars is offering some magic coupons. Each coupon has an integer N printed on it, meaning that when you use this coupon with a product, you may get N times the value of that product back! What is more, the shop also offers some bonus produ...
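The optimal strategy is greedy: pair the largest positive coupons with the largest positive products, and the most negative coupons with the most negative products; everything else is left unused. A sketch:

```python
def max_bonus(coupons, products):
    """Greedy pairing: positives matched descending with positives,
    negatives matched ascending with negatives; zip drops leftovers."""
    def desc_pos(xs):
        return sorted((x for x in xs if x > 0), reverse=True)

    def asc_neg(xs):
        return sorted(x for x in xs if x < 0)

    total = sum(c * p for c, p in zip(desc_pos(coupons), desc_pos(products)))
    total += sum(c * p for c, p in zip(asc_neg(coupons), asc_neg(products)))
    return total
```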

2020-07-10 08:33:48

1033 To Fill or Not to Fill (25 points)

With highways available, driving a car from Hangzhou to any other city is easy. But since the tank capacity of a car is limited, we have to find gas stations on the way from time to time. Different gas station may give di..

2020-07-09 17:07:38

1003 Emergency (25 points)

As an emergency rescue team leader of a city, you are given a special map of your country. The map shows several scattered cities connected by some roads. Amount of rescue teams in each city and the length of each road between any pair of cities are marked
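The problem is Dijkstra's algorithm with two extra bookkeeping arrays: the number of distinct shortest paths and the maximum rescue teams gatherable along one. A sketch on an undirected weighted graph (variable names are mine):

```python
import heapq

def emergency(n, edges, teams, src, dst):
    """Dijkstra from src. Returns (number of shortest paths to dst,
    max teams gatherable along one shortest path)."""
    adj = [[] for _ in range(n)]
    for u, v, w in edges:
        adj[u].append((v, w))
        adj[v].append((u, w))
    INF = float("inf")
    dist = [INF] * n
    paths = [0] * n           # count of shortest paths to each node
    gather = [0] * n          # max teams along a shortest path
    done = [False] * n
    dist[src], paths[src], gather[src] = 0, 1, teams[src]
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if done[u]:
            continue
        done[u] = True
        for v, w in adj[u]:
            nd = d + w
            if nd < dist[v]:                  # strictly shorter path found
                dist[v] = nd
                paths[v] = paths[u]
                gather[v] = gather[u] + teams[v]
                heapq.heappush(pq, (nd, v))
            elif nd == dist[v]:               # another shortest path
                paths[v] += paths[u]
                gather[v] = max(gather[v], gather[u] + teams[v])
    return paths[dst], gather[dst]
```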

2020-07-08 20:39:47

Building a decision tree in Python

Decision trees are a common machine learning algorithm; this example follows the code in 《机器学习实践》. In a decision tree, information entropy is used to measure the purity of the data. Suppose the proportion of class-k samples in the current sample set D is $p_k$; then $Ent(D) = -\sum_{k=1}^{n} p_k \log_2 p_k$. The smaller the value of Ent(D), the higher the purity of D. Suppose a discrete attribute a has V possible values $\{ a^1, a^2, …, a^V \}$; if a is used to partition the sample set D, then…
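The entropy definition above maps directly to code; a small sketch computing Ent(D) from a list of class labels:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Ent(D) = -sum_k p_k * log2(p_k), with p_k the fraction of
    samples belonging to class k."""
    total = len(labels)
    return -sum((c / total) * log2(c / total)
                for c in Counter(labels).values())
```

A 50/50 split gives the maximum entropy of 1 bit, while a pure set gives 0, matching the claim that smaller Ent(D) means higher purity.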

2020-07-03 12:22:06

MNIST handwritten digit recognition in PyTorch

From 莫烦python (Morvan Python).

import torch
import torch.nn as nn
import torch.utils.data as Data
import torchvision          # dataset module
import matplotlib.pyplot as plt

torch.manual_seed(1)        # reproducible

# Hyper Parameters
EPOCH = 1                   # passes over the whole training set; to save time, we train only once
…

2020-06-24 10:07:24

