COMP9414 24T2
Artificial Intelligence
Assignment 1 - Artificial neural networks
Due: Week 5, Wednesday, 26 June 2024, 11:55 PM.
1 Problem context
Time Series Air Quality Prediction with Neural Networks: In this assignment, you will delve into the realm of time series prediction using neural network architectures. You will explore both classification and estimation tasks using a publicly available dataset.
You will be provided with a dataset named "Air Quality" [1], available on the UCI Machine Learning Repository (https://archive.ics.uci.edu/dataset/360/air+quality). We tailored this dataset for this assignment and made some modifications; therefore, please only use the attached dataset for this assignment.
The given dataset contains 8,358 instances of hourly averaged responses from an array of five metal oxide chemical sensors embedded in an air quality chemical multisensor device. The device was located in the field in a significantly polluted area at road level within an Italian city. Data were recorded from March 2004 to February 2005 (one year), representing the longest freely available recordings of on-field deployed air quality chemical sensor device responses. Ground truth hourly averaged concentrations for carbon monoxide, non-methane hydrocarbons, benzene, total nitrogen oxides, and nitrogen dioxide, among other variables, were provided by a co-located reference-certified analyser. The variables included in the dataset are listed in Table 1. Missing values within the dataset are tagged with the value -200.
Table 1: Variables within the dataset.

    Variable         Meaning
    CO(GT)           True hourly averaged concentration of carbon monoxide
    PT08.S1(CO)      Hourly averaged sensor response
    NMHC(GT)         True hourly averaged overall non-methane hydrocarbons concentration
    C6H6(GT)         True hourly averaged benzene concentration
    PT08.S2(NMHC)    Hourly averaged sensor response
    NOx(GT)          True hourly averaged NOx concentration
    PT08.S3(NOx)     Hourly averaged sensor response
    NO2(GT)          True hourly averaged NO2 concentration
    PT08.S4(NO2)     Hourly averaged sensor response
    PT08.S5(O3)      Hourly averaged sensor response
    T                Temperature
    RH               Relative Humidity
    AH               Absolute Humidity
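
Since missing values are tagged with -200, it is convenient to convert them to NaN when loading the data. A minimal sketch, assuming the attached file is a CSV named air_quality.csv (the file name is an assumption; adjust it to the provided file):

    import numpy as np
    import pandas as pd

    # Load the provided dataset (file name is an assumption; use the attached file).
    df = pd.read_csv("air_quality.csv")

    # Missing values are tagged with -200; replace them with NaN so they can be
    # detected and handled explicitly during preprocessing.
    df = df.replace(-200, np.nan)

    # Count of missing values per variable.
    print(df.isna().sum())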
2 Activities
This assignment focuses on two main objectives:
• Classification Task: You should develop a neural network that can predict whether the concentration of Carbon Monoxide (CO) exceeds a certain threshold (the mean of the CO(GT) values) based on historical air quality data. This is a binary classification task, where your model learns to classify instances into two categories: above or below the threshold. To determine the threshold, you must first calculate the mean value of CO(GT), excluding unknown data (missing values). Then, use this threshold to decide whether the value predicted by your network is above or below it (a short sketch of this threshold computation is given after the summary below). You are free to choose and design your own network, and there are no limitations on its structure. However, your network should be capable of handling missing values.
• Regression Task: You should develop a neural network that can predict the concentration of Nitrogen Oxides (NOx) based on other air quality features. This task involves estimating a continuous numerical value (the NOx concentration) from the input features using regression techniques. You are free to choose and design your own network, and there are no limitations on its structure; however, your model should be able to deal with missing values.
In summary, the classification task aims to divide instances into two categories (exceeding or not exceeding the CO(GT) threshold), while the regression task aims to predict a continuous numerical value (the NOx concentration).
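
As an illustration of how the classification threshold might be obtained, a minimal sketch, assuming the data has been loaded into a pandas DataFrame df with missing values already converted to NaN (as in the sketch above):

    # Mean of CO(GT), excluding missing values (pandas ignores NaN by default).
    co_threshold = df["CO(GT)"].mean()

    # Binary target: 1 if the concentration exceeds the threshold, 0 otherwise.
    y_class = (df["CO(GT)"] > co_threshold).astype(int)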
2.1 Data preprocessing
It is expected that you analyse the provided data and perform any required preprocessing. Some of the preprocessing tasks might include the ones listed below (a minimal sketch follows the list); however, not all of them are necessary, and you should evaluate each of them against the results obtained.
(a) Identify variation range for input and output variables.
(b) Plot each variable to observe the overall behaviour of the process.
(c) If outliers or missing data are detected, correct the data accordingly.
(d) Split the data for training and testing.
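
A minimal sketch of these steps, assuming the DataFrame df from the earlier sketch; the interpolation strategy and the 80/20 split ratio are assumptions, not requirements:

    import matplotlib.pyplot as plt
    from sklearn.model_selection import train_test_split

    num = df.select_dtypes("number")  # numeric variables only

    # (a) Variation range of each variable.
    print(num.agg(["min", "max"]).T)

    # (b) Plot each variable to observe its overall behaviour.
    num.plot(subplots=True, figsize=(10, 2 * num.shape[1]))
    plt.tight_layout()
    plt.show()

    # (c) One possible treatment of missing data: interpolate over time.
    num_clean = num.interpolate(limit_direction="both")

    # (d) Split the data for training and testing (80/20 is an assumed ratio;
    #     shuffle=False keeps the chronological order of the time series).
    train_df, test_df = train_test_split(num_clean, test_size=0.2, shuffle=False)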
2.2 Design of the neural network
You should select and design neural architectures for addressing both the classification and regression problems described above. In each case, consider the following steps (a minimal sketch is given after this list):
(a) Design the network and decide the number of layers, units, and their respective activation functions.
(b) Remember it is recommended that the number of parameters of your network satisfy Nw < (number of samples)/10.
(c) Create the neural network using Keras and TensorFlow.
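
A minimal Keras sketch for the classification task; the layer sizes and activations are assumptions rather than a recommended architecture, and n_features stands for however many input variables you decide to use:

    from tensorflow import keras

    n_features = 10  # assumed number of input features

    model = keras.Sequential([
        keras.layers.Input(shape=(n_features,)),
        keras.layers.Dense(16, activation="relu"),
        keras.layers.Dense(8, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),  # single sigmoid unit for binary output
    ])

    # Check the guideline Nw < (number of samples) / 10.
    print("parameters:", model.count_params())

For the regression task, the output layer would typically be a single linear unit instead of a sigmoid.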
2.3 Training
In this section, you have to train your proposed neural network. Consider the following steps (a minimal training sketch is given after this list):
(a) Decide the training parameters, such as the loss function, optimizer, batch size, learning rate, and number of epochs.
(b) Train the neural model and monitor the loss values during the process.
(c) Check for possible overfitting problems.
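
A minimal training sketch, assuming the model from the sketch above; the loss, optimizer, batch size, learning rate, number of epochs, and the use of early stopping are all assumptions you should tune yourself, and X_train/y_train stand for your preprocessed training inputs and labels:

    from tensorflow import keras

    model.compile(
        loss="binary_crossentropy",                      # assumed loss for the classification task
        optimizer=keras.optimizers.Adam(learning_rate=1e-3),
        metrics=["accuracy"],
    )

    # Hold out part of the training data for validation and watch for overfitting:
    # a validation loss that rises while the training loss keeps falling is a warning sign.
    early_stop = keras.callbacks.EarlyStopping(patience=10, restore_best_weights=True)

    history = model.fit(
        X_train, y_train,          # assumed preprocessed training arrays
        validation_split=0.2,
        epochs=100,
        batch_size=32,
        callbacks=[early_stop],
    )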
2.4 Validating the neural model
Assess your results by plotting the training results and the network response for the test inputs against the test targets. Compute error indexes to complement the visual analysis.
(a) For the classification task, draw two different plots to illustrate your
results over different epochs. In the first plot, show the training and
validation loss over the epochs. In the second plot, show the training
and validation accuracy over the epochs. For example, Figure 1 and
Figure 2 show loss and classification accuracy plots for 100 epochs,
respectively.
Figure 1: Loss plot for the classification task.
Figure 2: Accuracy plot for the classification task.
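
A minimal sketch of such plots from the Keras History object returned by model.fit (assuming the training sketch above, with "accuracy" among the compiled metrics):

    import matplotlib.pyplot as plt

    # Training vs validation loss over the epochs (cf. Figure 1).
    plt.figure()
    plt.plot(history.history["loss"], label="training loss")
    plt.plot(history.history["val_loss"], label="validation loss")
    plt.xlabel("epoch"); plt.ylabel("loss"); plt.legend()

    # Training vs validation accuracy over the epochs (cf. Figure 2).
    plt.figure()
    plt.plot(history.history["accuracy"], label="training accuracy")
    plt.plot(history.history["val_accuracy"], label="validation accuracy")
    plt.xlabel("epoch"); plt.ylabel("accuracy"); plt.legend()
    plt.show()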
(b) For the classification task, compute a confusion matrix (https://en.wikipedia.org/wiki/Confusion_matrix) including True Positive (TP), True Negative (TN), False Positive (FP), and False Negative (FN), as shown in Table 2. Moreover, report accuracy and precision for your test data and mention the number of tested samples, as shown in Table 3 (the numbers shown in both tables are randomly chosen and may not be consistent with each other). For instance, the Sklearn library offers a wide range of metric functions (https://scikit-learn.org/stable/api/sklearn.metrics.html), including the confusion matrix (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.confusion_matrix.html), accuracy, and precision. You can use the Sklearn built-in metric functions to calculate the mentioned metrics or develop your own functions.
Table 2: Confusion matrix for the test data for the classification task.

                              Positive (Actual)   Negative (Actual)
    Positive (Predicted)            103                   6
    Negative (Predicted)              6                  75

Table 3: Accuracy and precision for the test data for the classification task.

                              Accuracy   Precision   Number of Samples
    CO(GT) classification       63%        60%             190
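
A minimal sketch using the Sklearn metric functions mentioned above; X_test and y_test are assumed to be your preprocessed test inputs and binary labels, and model is the trained classifier from the earlier sketches:

    from sklearn.metrics import accuracy_score, confusion_matrix, precision_score

    # Threshold the sigmoid outputs at 0.5 to obtain binary predictions.
    y_pred = (model.predict(X_test) > 0.5).astype(int).ravel()

    # Note: sklearn's confusion matrix has actual classes as rows and predicted
    # classes as columns, i.e. [[TN, FP], [FN, TP]].
    print(confusion_matrix(y_test, y_pred))
    print("accuracy: ", accuracy_score(y_test, y_pred))
    print("precision:", precision_score(y_test, y_pred))
    print("samples:  ", len(y_test))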
(c) For the regression task, draw two different plots to illustrate your results. In the first plot, show how the selected loss function varies for both training and validation through the epochs. In the second plot, show the final estimation results for the validation set. For instance, Figure 3 and Figure 4 show the loss function and the network outputs vs the actual NOx(GT) values for a validation set, respectively. In Figure 4 no data preprocessing has been performed; however, as mentioned above, it is expected that you include this in your assignment.

Figure 3: Loss plot for the regression task.
Figure 4: Estimated and actual NOx(GT) for the validation set.
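
A minimal sketch of the second plot for the regression task; reg_model, X_test and y_test here are assumed to be your trained regression network and its preprocessed test inputs and NOx(GT) targets:

    import matplotlib.pyplot as plt

    y_hat = reg_model.predict(X_test).ravel()

    # Estimated vs actual NOx(GT) over the test/validation samples (cf. Figure 4).
    plt.figure()
    plt.plot(y_test, label="actual NOx(GT)")
    plt.plot(y_hat, label="estimated NOx(GT)")
    plt.xlabel("sample"); plt.ylabel("NOx(GT)"); plt.legend()
    plt.show()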
(d) For the regression task, report performance indexes including the Root Mean Squared Error (RMSE), the Mean Absolute Error (MAE) (see a discussion in [2]), and the number of samples for your estimation of NOx(GT) values in a table. The Root Mean Squared Error (RMSE) measures the differences between the observed values and the predicted ones and is defined as follows:

    RMSE = \sqrt{\frac{1}{n} \sum_{i=1}^{n} (Y_i - \hat{Y}_i)^2},    (1)

where n is the number of samples, Y_i is the actual label and \hat{Y}_i is the predicted value. In the same way, MAE can be defined as the average of the absolute errors as follows:

    MAE = \frac{1}{n} \sum_{i=1}^{n} |Y_i - \hat{Y}_i|.    (2)
Table 4 shows an example of the performance indexes (all numbers are randomly chosen and may not be consistent with each other). As mentioned before, the Sklearn library offers a wide range of metric functions, including RMSE (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.root_mean_squared_error.html) and MAE (https://scikit-learn.org/stable/modules/generated/sklearn.metrics.mean_absolute_error.html). You can use the Sklearn built-in metric functions to calculate the mentioned metrics or develop your own functions.

Table 4: Result table for the test data for the regression task.

    RMSE    MAE     Number of Samples
    90.60   50.35          55
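
A minimal sketch of these indexes, with y_test and y_hat as in the previous sketch; taking the square root of mean_squared_error is used here because it works on any scikit-learn version (newer versions also provide root_mean_squared_error directly):

    import numpy as np
    from sklearn.metrics import mean_absolute_error, mean_squared_error

    rmse = np.sqrt(mean_squared_error(y_test, y_hat))   # Eq. (1)
    mae = mean_absolute_error(y_test, y_hat)            # Eq. (2)

    print(f"RMSE: {rmse:.2f}  MAE: {mae:.2f}  samples: {len(y_test)}")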
3 Testing and discussing your code
As part of the assignment evaluation, your code will be tested by tutors together with you in a discussion session carried out during the Week 6 tutorial. The assignment has a total of 25 marks. The discussion is mandatory and, therefore, we will not mark any assignment that is not discussed with tutors.
You are expected to propose and build neural models for the classification and regression tasks. The minimal output we expect to see is the set of results described in Section 2.4. You will receive marks for each of these subsections as shown in Table 5, i.e. 7 marks in total. However, it is fine if you want to include any other outcome to highlight particular aspects when testing and discussing your code with your tutor.
For marking your results, you should be prepared to simulate your neural model with a generalisation set we have set aside for that purpose. You must anticipate this by including in your submission a script ready to open a file (with the same characteristics as the given dataset but with fewer data points), simulate the network, and perform all the validation tests described in Section 2.4 (b) and (d) (accuracy, precision, RMSE, MAE). It is recommended to save all of your hyper-parameters and weights (your model in general) so you can call your network and perform the analysis later in your discussion session (a minimal saving/loading sketch is given below).
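
A minimal sketch of saving the trained model and reloading it to evaluate a generalisation file; the file names are assumptions, and the .keras format is the native Keras saving format in recent TensorFlow versions:

    import numpy as np
    import pandas as pd
    from tensorflow import keras

    # Save the trained model (architecture, weights and optimizer state).
    model.save("co_classifier.keras")

    # Later, e.g. in the discussion session: reload the model and open a new file
    # with the same characteristics as the given dataset.
    model = keras.models.load_model("co_classifier.keras")
    gen_df = pd.read_csv("generalisation_set.csv").replace(-200, np.nan)
    # ... apply exactly the same preprocessing as for training, then compute
    # accuracy, precision, RMSE and MAE as described in Section 2.4 (b) and (d).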
For the classification task you need to compute accuracy and precision, while for the regression task you need to compute RMSE and MAE, using the generalisation set. You will receive 3 marks for each task, given successful results. Expected results should be as follows:

• For the classification task, your network should achieve at least 85% accuracy and precision. Accuracy and precision lower than that will result in a score of 0 marks for that specific section.

• For the regression task, it is expected that you achieve an RMSE of at most 280 and an MAE of at most 220 for unseen data points. Errors higher than the mentioned values will be marked with 0 marks.
Finally, you will receive 1 mark for code readability for each task, and
your tutor will also give you a maximum of 5 marks for each task depending
on the level of code understanding as follows: 5. Outstanding, 4. Great,
3. Fair, 2. Low, 1. Deficient, 0. No answer.
Table 5: Marks for each task.

    Task                                                                        Marks
    Results obtained with the given dataset
      Loss and accuracy plots for classification task                           2 marks
      Confusion matrix and accuracy/precision tables for classification task    2 marks
      Loss and estimated NOx(GT) plots for regression task                      2 marks
      Performance indexes table for regression task                             1 mark
    Results obtained with the generalisation dataset
      Accuracy and precision for classification task                            3 marks
      RMSE and MAE for regression task                                          3 marks
    Code understanding and discussion
      Code readability for classification task                                  1 mark
      Code readability for regression task                                      1 mark
      Code understanding and discussion for classification task                 5 marks
      Code understanding and discussion for regression task                     5 marks
    Total marks                                                                 25 marks
4 Submitting your assignment
The assignment must be done individually. You must submit your assignment solution via Moodle. This will consist of a single .ipynb Jupyter file. This file should contain all the necessary code for reading files, data preprocessing, network architecture, and result evaluations. Additionally, your file should include short text descriptions to help markers better understand your code. Please be mindful that providing clean and easy-to-read code is a part of your assignment.
Please indicate your full name and your zID at the top of the file as a comment. You can submit as many times as you like before the deadline; later submissions overwrite earlier ones. After submitting your file, it is good practice to take a screenshot of it for future reference.
Late submission penalty: UNSW has a standard late submission penalty of 5% of your mark per day, capped at five days from the assessment deadline; after that, students cannot submit the assignment.
5 Deadline and questions
Deadline: Week 5, Wednesday, 26 June 2024, 11:55 PM. Please use the forum on Moodle to ask questions related to the project. We will prioritise questions asked in the forum. However, you should not share your code, to avoid making it public and enabling possible plagiarism. In that case, use the course email cs9414@cse.unsw.edu.au as an alternative.
Although we try to answer questions as quickly as possible, we might take up to 1 or 2 business days to reply; therefore, last-minute questions might not be answered in time.
6 Plagiarism policy
Your program must be entirely your own work. Plagiarism detection software
might be used to compare submissions pairwise (including submissions for
any similar projects from previous years) and serious penalties will be applied,
particularly in the case of repeat offences.
Do not copy from others. Do not allow anyone to see your code.
Please refer to the UNSW Policy on Academic Honesty and Plagiarism if you
require further clarification on this matter.
References
[1] De Vito, S., Massera, E., Piga, M., Martinotto, L. and Di Francia, G., 2008. On field calibration of an electronic nose for benzene estimation in an urban pollution monitoring scenario. Sensors and Actuators B: Chemical, 129(2), pp. 750-757.
[2] Hodson, T. O., 2022. Root mean square error (RMSE) or mean absolute error (MAE): When to use them or not. Geoscientific Model Development Discussions, 2022, pp. 1-10.

    yjizz视频| 国产男女猛烈无遮挡a片漫画| 在线观看成人毛片| 特一级黄色录像| 日韩免费av一区| 国产美女福利视频| 9.1人成人免费视频网站| 日本中文字幕精品| www.av成人| 99国产精品免费视频| wwwxx日本| aaaaaav| 国产传媒在线看| 国产精品夜夜夜爽阿娇| 人妻人人澡人人添人人爽| 校园春色 亚洲| 国产白袜脚足j棉袜在线观看| 99久久人妻精品免费二区| 亚洲国产精品自拍视频| 精品无码国产污污污免费网站 | 日本理论中文字幕| 四虎影视1304t| 佐山爱在线视频| 毛片网站免费观看| 我要看黄色一级片| 天天躁日日躁狠狠躁免费麻豆| 伊人网综合视频| 欧美成人短视频| 中文字幕人妻一区| 日韩黄色中文字幕| 中文字幕 亚洲一区| 污污视频网站在线免费观看| 99riav国产精品视频| 国产精品揄拍100视频| 欧美色图亚洲视频| 国产全是老熟女太爽了| 四虎国产精品永久免费观看视频| 大地资源二中文在线影视观看| 天天做夜夜爱爱爱| 五月天综合视频| 性高潮久久久久久| 黄大色黄女片18免费| 丰满少妇中文字幕| 最近中文字幕免费| 中文字幕a在线观看| 中文字幕久久久久久久| 中文字幕资源站| 东京热无码av男人的天堂| 久久精品女同亚洲女同13| 91porn在线视频| 杨钰莹一级淫片aaaaaa播放| 性少妇xx生活| 日本综合在线观看| 日韩中文字幕有码| 日本精品在线观看视频| 全黄一级裸体片| 在线观看国产网站| 欧美熟妇精品黑人巨大一二三区| 精品国产免费久久久久久婷婷| 中文字幕美女视频| 美女视频久久久| 卡通动漫亚洲综合| 美女的奶胸大爽爽大片| 91高清免费看| 久草网站在线观看| 又色又爽又黄18网站| 国模无码视频一区| 野外性满足hd| 欧美激情亚洲色图| 亚洲一级理论片| 国产精品 欧美激情| 中文字幕制服丝袜| 亚洲欧美视频在线播放| 深爱五月激情网| 国产精成人品免费观看| 日本黄色片免费观看| 国产裸体视频网站| 青青草成人免费视频| 成人国产精品久久久网站| 国产成人精品视频免费| 日韩黄色免费观看| 给我免费观看片在线电影的| 一区二区精品免费| 超碰手机在线观看| 天堂久久久久久| 久久久久久久久久97| 极品魔鬼身材女神啪啪精品| 手机免费看av片| 国产精品久久免费观看| 日韩av成人网| 熟女俱乐部一区二区视频在线| 大胸美女被爆操| 逼特逼视频在线观看| 99自拍偷拍视频| 日韩av手机在线播放| 黄色香蕉视频在线观看| 亚洲第一黄色网址| 久久久久无码精品| 国产在线免费av| 91丨porny丨对白| 黄色一级大片在线免费观看| yy6080午夜| 免费观看污网站| 日韩精品123区| 人妻精品久久久久中文| 亚洲v在线观看| 在线看的片片片免费| 国产美女免费网站| 一本加勒比北条麻妃| 国产综合内射日韩久| 少妇高潮惨叫久久久久| a毛片毛片av永久免费| av免费观看不卡| 精品国产午夜福利在线观看| 国产稀缺精品盗摄盗拍| 精品无码在线观看| 亚洲自拍偷拍一区二区| 中国av免费看| 动漫美女无遮挡免费| 美女又黄又免费的视频| 国产88在线观看入口| 91成人福利视频| 希岛爱理中文字幕| 久热这里有精品| 午夜视频在线免费看| aaa黄色大片| 久久久午夜精品福利内容| 亚洲中文字幕一区| 风间由美一二三区av片| 亚洲精品成人无码熟妇在线| 91成人破解版| 日韩不卡av在线| 三级黄色录像视频| 午夜国产福利一区二区| 久久黄色一级视频| 精品国产aⅴ一区二区三区东京热| 日本人妻一区二区三区| 日本久久久久久久久久| theav精尽人亡av| 日韩不卡av在线| 亚洲综合中文网| 黄色a一级视频| 黑人狂躁日本娇小| 免费日本黄色网址| 日本 欧美 国产| 免费看毛片的网站| 中国1级黄色片| 在线中文字日产幕| 亚洲自拍偷拍图| 一卡二卡三卡四卡五卡| 泷泽萝拉在线播放| 少妇久久久久久被弄高潮| 丰满少妇一区二区三区| 久久国产高清视频| 性欧美丰满熟妇xxxx性仙踪林| 永久免费看片直接| 日本一区二区在线免费观看| 少妇太紧太爽又黄又硬又爽小说| 四虎国产精品永久免费观看视频| 美女久久久久久久久久| 麻豆网站免费观看| 天堂网av2018| 亚洲码无人客一区二区三区| 四虎国产精品免费| 殴美一级黄色片| 少妇大叫太粗太大爽一区二区| 久久久久无码精品| 日韩精品久久久久久久的张开腿让| 中文在线观看免费视频| 免费在线观看一级片| av资源在线免费观看| 娇妻被老王脔到高潮失禁视频| 精品人妻人人做人人爽夜夜爽| 欧美88888| 538精品视频| 非洲一级黄色片| 中文字幕丰满孑伦无码专区| 四虎精品一区二区| 亚洲国产精品第一页| 韩国三级hd中文字幕有哪些| 国产女人18水真多毛片18精品| 黄大色黄女片18免费| 少妇视频在线播放| 国产又粗又长免费视频| 成人免费视频入口| 久久久无码人妻精品一区| 色综合久久久无码中文字幕波多| 国产va在线播放| 18深夜在线观看免费视频| 欧洲猛交xxxx乱大交3| 少妇影院在线观看| 制服丝袜在线第一页| 玖玖爱在线观看| 久久久精品成人| 免费成年人视频在线观看| 国产精品国产精品88| 无码人妻精品一区二区三| 中文在线一区二区三区| 久久久久无码精品国产sm果冻 | 无码国产69精品久久久久同性|