Translation

Introduction: Translation

Table of Contents

1. Translation

Summary


1. Translation

Problem link:


Problem:

A. Translation

time limit per test: 2 seconds

memory limit per test: 256 megabytes

input: standard input

output: standard output

The translation from the Berland language into the Birland language is not an easy task. The two languages are very similar: a Berlandish word differs from the Birlandish word with the same meaning only slightly: it is spelled (and pronounced) in reverse. For example, the Berlandish word code corresponds to the Birlandish word edoc. However, it is easy to make a mistake during the «translation». Vasya translated the word s from Berlandish into Birlandish as t. Help him: find out whether he translated the word correctly.


Input

The first line contains the word s; the second line contains the word t. The words consist of lowercase Latin letters. The input data contain no unnecessary spaces. The words are not empty and their lengths do not exceed 100 symbols.


Output

If the word t is the word s written in reverse, print YES; otherwise print NO.


Examples

input

code

edoc

output

YES


input

abb

aba

output

NO


input

code

code

output

NO



Problem summary: read two strings; if reversing one of them yields exactly the other, print YES, otherwise print NO.

AC Code 1

#include <iostream>
#include <string>
#include <cstdio>
using namespace std;
int main()
{
  string a, b;
  cin >> a >> b;
  // words of different lengths can never be reverses of each other;
  // this check also keeps j from running below zero in the loop
  bool flag = (a.size() != b.size());
  if (!flag)
    for (int i = 0, j = b.size() - 1; i < (int)a.size(); i ++, j -- )
      if (a[i] != b[j])   // compare a front-to-back against b back-to-front
      {
        flag = true;
        break;
      }
  if (flag) puts("NO");
  else puts("YES");
  return 0;
}

AC Code 2

#include <iostream>
#include <string>
#include <algorithm>
#include <cstdio>
using namespace std;
int main()
{
  string a, b;
  cin >> a >> b;
  reverse(a.begin(), a.end());   // reverse a in place
  if (a == b) puts("YES");       // reversed a must match b exactly
  else puts("NO");
  return 0;
}

Summary

An easy problem; nothing more to explain.

