
[LeetCode] 1003. Check If Word Is Valid After Substitutions



If you are interested in [LeetCode] 1003. Check If Word Is Valid After Substitutions, this article is for you. We will go through [LeetCode] 1003. Check If Word Is Valid After Substitutions in detail, along with practical notes on 0030. Substring with Concatenation of All Words (H), "ApplicationMaster: TimeoutException: Futures timed out after [100000 milliseconds]" — what is this problem?, NSubstitute unit testing of a Redis cache behind IConnectionMultiplexer, and LeetCode 003. Longest Substring Without Repeating.

Contents:

[LeetCode] 1003. Check If Word Is Valid After Substitutions


We are given that the string "abc" is valid.

From any valid string V, we may split V into two pieces X and Y such that X + Y (X concatenated with Y) is equal to V.  (X or Y may be empty.)  Then, X + "abc" + Y is also valid.

If for example S = "abc", then examples of valid strings are: "abc", "aabcbc", "abcabc", "abcabcababcc".  Examples of invalid strings are: "abccba", "ab", "cababc", "bac".

Return true if and only if the given string S is valid.

 

Example 1:

Input: "aabcbc"
Output: true
Explanation: 
We start with the valid string "abc".
Then we can insert another "abc" between "a" and "bc", resulting in "a" + "abc" + "bc" which is "aabcbc".

Example 2:

Input: "abcabcababcc"
Output: true
Explanation: 
"abcabcabc" is valid after consecutive insertions of "abc".
Then we can insert "abc" before the last letter, resulting in "abcabcab" + "abc" + "c" which is "abcabcababcc".

Example 3:

Input: "abccba"
Output: false

Example 4:

Input: "cababc"
Output: false

 

Note:

  1. 1 <= S.length <= 20000
  2. S[i] is 'a', 'b', or 'c'

 

Brute-force solution, O(n^2): scan through the entire string and remove all "abc" patterns, repeating until the string is empty or reaches a stable state where no more "abc" can be found.

class Solution {
    public boolean isValid(String S) {
        String before = S, after = null;
        while(!before.equals("")) {
            after = before.replaceAll("abc", "");
            if(after.length() == before.length()) {
                return false;
            }
            before = after;
        }
        return true;
    }
}

 

Optimal solution with Stack, O(n) runtime, O(n) space

Intuition: each time we remove "abc", the previously visited characters must keep the same relative order as before the removal. Each time we see a 'c', we need to check that its two closest predecessors are 'a' and 'b'. This is LIFO order, hence we should use a stack.

class Solution {
    public boolean isValid(String S) {
        Stack<Character> stack = new Stack<>();
        for(char c : S.toCharArray()) {
            if(c == 'c') {
                if(stack.size() < 2) {
                    return false;
                }
                char b = stack.pop();
                char a = stack.pop();
                if(a != 'a' || b != 'b') {
                    return false;
                }
            }
            else {
                stack.push(c);
            }
        }
        return stack.isEmpty();
    }
}
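As a small variation on the same LIFO idea, the Stack can be replaced by a StringBuilder used as a character stack, avoiding the boxing of each char into a Character. This is a sketch, not the solution above; the class name ValidAbc is made up to avoid clashing with Solution:

```java
// Same LIFO idea, but with StringBuilder as the stack (no boxing).
// Whenever the last three characters form "abc", drop all three.
class ValidAbc {
    static boolean isValid(String s) {
        StringBuilder sb = new StringBuilder();
        for (char c : s.toCharArray()) {
            sb.append(c);
            int n = sb.length();
            // If the top of the "stack" is now "abc", pop it.
            if (n >= 3 && sb.charAt(n - 3) == 'a'
                       && sb.charAt(n - 2) == 'b'
                       && sb.charAt(n - 1) == 'c') {
                sb.setLength(n - 3);
            }
        }
        return sb.length() == 0; // valid iff everything was removed
    }
}
```

The runtime is still O(n): each character is appended once and removed at most once.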

 

0030. Substring with Concatenation of All Words (H)


Problem

You are given a string, s, and a list of words, words, that are all of the same length. Find all starting indices of substring(s) in s that is a concatenation of each word in words exactly once and without any intervening characters.

Example 1:

Input:
  s = "barfoothefoobarman",
  words = ["foo","bar"]
Output: [0,9]
Explanation: Substrings starting at index 0 and 9 are "barfoo" and "foobar" respectively.
The output order does not matter, returning [9,0] is fine too.

Example 2:

Input:
  s = "wordgoodgoodgoodbestword",
  words = ["word","good","best","word"]
Output: []

Problem summary

Given a string s and an array of words words (all of the same length), find every position where a substring of s is exactly the concatenation of all the words in words (in some order, with no characters in between).

Approach

Brute force: treat each word as if it were a single character and match word by word. Since the word order is not fixed, use a HashMap to record each word and its occurrence count.

\(O(N)\) optimization: the flaw of the brute-force method is that it does a lot of repeated computation. See LeetCode 30. Substring with Concatenation of All Words for the sliding-window optimization.


Implementation

Java

Brute force

class Solution {
    public List<Integer> findSubstring(String s, String[] words) {
        if (words.length == 0) {
            return new ArrayList<>();
        }
      
        int len = words[0].length(), num = words.length;
        List<Integer> list = new ArrayList<>();
        Map<String, Integer> record = new HashMap<>();
        Map<String, Integer> tmp = new HashMap<>();
      
        for (String word : words) {
            record.putIfAbsent(word, 0);
            record.put(word, record.get(word) + 1);
        }
      
        for (int i = 0; i < s.length(); i++) {
            if (s.length() - i < num * len) {
                break;
            }
          
            tmp.clear();
            int j = i;
          
            while (j - i != num * len) {
                String w = s.substring(j, j + len);
                if (!record.containsKey(w)) {
                    break;
                }
                tmp.putIfAbsent(w, 0);
                tmp.put(w, tmp.get(w) + 1);
                if (tmp.get(w) > record.get(w)) {
                    break;
                }
                j += len;
            }
          
            if (j - i == num * len) {
                list.add(i);
            }
        }
        return list;
    }
}

\(O(N)\) optimization

class Solution {
    public List<Integer> findSubstring(String s, String[] words) {
        if (words.length == 0) {
            return new ArrayList<>();
        }
      
        int len = words[0].length(), num = words.length;
        List<Integer> list = new ArrayList<>();
        Map<String, Integer> record = new HashMap<>();
        Map<String, Integer> tmp = new HashMap<>();
      
        for (String word : words) {
            record.putIfAbsent(word, 0);
            record.put(word, record.get(word) + 1);
        }

        for (int i = 0; i < len && i < s.length(); i++) {
            if (s.length() - i < num * len) {
                break;
            }
          
            tmp.clear();
            int left = i, right = i;
          
            while (true) {
                if (right - left == num * len) {
                    list.add(left);
                    String headW = s.substring(left, left + len);
                    tmp.put(headW, record.get(headW) - 1);
                    left += len;
                }
              
                if (s.length() - left < num * len) {
                    break;
                }
              
                String w = s.substring(right, right + len);
              
                if (!record.containsKey(w)) {
                    tmp.clear();
                    left = right + len;
                } else {
                    tmp.putIfAbsent(w, 0);
                    tmp.put(w, tmp.get(w) + 1);
                    while (tmp.get(w) > record.get(w) && s.length() - left >= num * len) {
                        String headW = s.substring(left, left + len);
                        tmp.put(headW, tmp.get(headW) - 1);
                        left += len;
                    }
                }
              
                right += len;
            }
        }

        return list;
    }
}
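For quick sanity checks against the two examples, the brute-force version above can be condensed into a self-contained sketch. The class name WordConcat and the use of Map.merge are my own choices for this example, not from the original:

```java
import java.util.*;

// Compact brute force: at each start index, consume words of length `len`
// and compare counts against the required multiset of words.
class WordConcat {
    static List<Integer> findSubstring(String s, String[] words) {
        List<Integer> res = new ArrayList<>();
        if (words.length == 0) return res;
        int len = words[0].length(), total = len * words.length;
        Map<String, Integer> need = new HashMap<>();
        for (String w : words) need.merge(w, 1, Integer::sum);
        for (int i = 0; i + total <= s.length(); i++) {
            Map<String, Integer> seen = new HashMap<>();
            int j = i;
            while (j < i + total) {
                String w = s.substring(j, j + len);
                if (!need.containsKey(w)) break;          // unknown word
                seen.merge(w, 1, Integer::sum);
                if (seen.get(w) > need.get(w)) break;     // word used too often
                j += len;
            }
            if (j == i + total) res.add(i);               // consumed all words
        }
        return res;
    }
}
```

Running it on Example 1 ("barfoothefoobarman", ["foo","bar"]) yields [0, 9], and Example 2 yields an empty list.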

ApplicationMaster: TimeoutException: Futures timed out after [100000 milliseconds] — what is this problem?


Kafka version: 2.2.1-cdh6.3.0
20/03/30 15:19:03 INFO utils.AppInfoParser: Kafka commitId: null
20/03/30 15:19:03 INFO consumer.KafkaConsumer: [Consumer clientId=consumer-1, groupId=MaltrailIncidentCount032118ffb885-120e-422f-9b35-c18b32b80922] Subscribed to topic(s): rsd.sensor.incident.log
20/03/30 15:20:40 ERROR yarn.ApplicationMaster: Uncaught exception: 
java.util.concurrent.TimeoutException: Futures timed out after [100000 milliseconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:223)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:227)
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:220)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:447)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
20/03/30 15:20:40 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: Uncaught exception: java.util.concurrent.TimeoutException: Futures timed out after [100000 milliseconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:223)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:227)
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:220)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:447)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:275)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:805)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:804)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:804)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
)
20/03/30 15:20:40 INFO spark.SparkContext: Invoking stop() from shutdown hook
20/03/30 15:20:40 INFO server.AbstractConnector: Stopped Spark@44ec9751{HTTP/1.1,[http/1.1]}{0.0.0.0:0}
20/03/30 15:20:40 INFO ui.SparkUI: Stopped Spark web UI at http://slave3:43670
20/03/30 15:20:40 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/03/30 15:20:40 INFO memory.MemoryStore: MemoryStore cleared
20/03/30 15:20:40 INFO storage.BlockManager: BlockManager stopped
20/03/30 15:20:40 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
20/03/30 15:20:40 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/03/30 15:20:40 INFO spark.SparkContext: Successfully stopped SparkContext
20/03/30 15:20:40 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://rongan/user/hbase/.sparkStaging/application_1585536197032_0071
20/03/30 15:20:40 INFO util.ShutdownHookManager: Shutdown hook called
20/03/30 15:20:40 INFO util.ShutdownHookManager: Deleting directory /data/yarn/nm2/usercache/hbase/appcache/application_1585536197032_0071/spark-27e2d897-1130-4278-a924-fe6498b9689f


[2020-03-30 15:20:41.476]Container exited with a non-zero exit code 13. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
d = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer

(The "Last 4096 bytes of stderr" tail repeats the ApplicationMaster exception and shutdown log shown above.)


For more detailed output, check the application tracking page: http://master:8088/cluster/app/application_1585536197032_0071 Then click on links to logs of each attempt.
. Failing the application.
Exception in thread "main" org.apache.spark.SparkException: Application application_1585536197032_0071 finished with failed status
    at org.apache.spark.deploy.yarn.Client.run(Client.scala:1158)
    at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1606)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:926)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:935)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
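The post gives no answer, but this exact trace (runDriver → awaitResult timing out after 100000 ms) typically appears in yarn-cluster mode when the user class does not create a SparkContext within spark.yarn.am.waitTime, which defaults to 100s — the 100000 milliseconds in the message. A hedged sketch of the workaround when driver startup is legitimately slow (your-app.jar is a placeholder, not from the original log):

```shell
# Not a confirmed fix for this cluster — only the usual first thing to try.
# Raise the time the ApplicationMaster waits for the driver's SparkContext:
spark-submit \
  --master yarn --deploy-mode cluster \
  --conf spark.yarn.am.waitTime=300s \
  your-app.jar
```

If raising the wait does not help, the driver code itself is likely blocking before SparkContext creation (e.g. on an unreachable external service), which is worth checking in the driver logs first.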

NSubstitute unit test for a Redis cache behind IConnectionMultiplexer


How do I unit-test Redis caching through IConnectionMultiplexer with NSubstitute?

This is my class that uses Redis:

public class RateLimitMiddleware
{
    private readonly RequestDelegate _next;
    private readonly IConnectionMultiplexer _connectionMultiplexer;
    private readonly Options _options;

    public RateLimitMiddleware(RequestDelegate next, IConnectionMultiplexer connectionMultiplexer, Options options)
    {
        if (next == null)
        {
            throw new ArgumentNullException(nameof(next));
        }

        if (connectionMultiplexer == null)
        {
            throw new ArgumentNullException(nameof(connectionMultiplexer));
        }

        if (options == null)
        {
            throw new ArgumentNullException(nameof(options));
        }

        _next = next;
        _connectionMultiplexer = connectionMultiplexer;
        _options = options;
    }

    public async Task Invoke(HttpContext context)
    {
            var requestsKeyStore = _connectionMultiplexer.GetDatabase();

            var consumerIP = context.Connection.RemoteIpAddress.ToString();

            var consumerKey = $"consumer.throttle#{consumerIP}";

            var requestkeyval = await requestsKeyStore.HashIncrementAsync(consumerKey, 1);

            if (requestkeyval == 1)
            {
                await requestsKeyStore.KeyExpireAsync(
                    consumerKey, _options.RateLimitKeyExpire, CommandFlags.FireAndForget);
            }
            else if (requestkeyval > _options.RateLimit)
            {
                context.Response.StatusCode = StatusCodes.Status429TooManyRequests;
                using (var writer = new StreamWriter(context.Response.Body))
                {
                    await writer.WriteAsync("Too many requests.");
                }

                return;
            }

            await _next(context);
    }
}

I am trying to test it with NSubstitute:

public async Task CheckRateLimit_Should_Fail_On_Too_Much_Requests()
    {
        var context = new DefaultHttpContext();
        context.Connection.RemoteIpAddress = IPAddress.Parse("127.0.0.1");
        var ip = context.Connection.RemoteIpAddress.ToString();
    
        var _connectionMultiplexer = Substitute.For<IConnectionMultiplexer>();
        _connectionMultiplexer.IsConnected.Returns(false);
        var multiplexerDb = Substitute.For<IDatabase>();
        _connectionMultiplexer
            .GetDatabase(Arg.Any<Int32>(), Arg.Any<Object>()).Returns(multiplexerDb);


        var logger = Substitute.For<ILogger<RateLimitMiddleware>>();
        var middleware = new RateLimitMiddleware(
            innerHttpContext => Task.CompletedTask, _connectionMultiplexer, _options, logger
        );

        var startMSecond = DateTime.Now.TimeOfDay.TotalMilliseconds;
        var curMSecond = startMSecond;
        var count = 0;
        while ((curMSecond - startMSecond) < 1000 && count <= _options.RateLimit)
        {
            await middleware.Invoke(context);
            curMSecond = DateTime.Now.TimeOfDay.TotalMilliseconds;
            count++;
        }

        Assert.AreEqual(429, context.Response.StatusCode);
    }

I set _options.RateLimit to 2 in the test class, so in the while loop I send more than 2 requests within one second and should therefore get a 429 response.

When I did not mock IConnectionMultiplexer and connected to Redis on localhost, it worked. Now it does not. After debugging I can see that on every request through the middleware, the call gets a fresh, empty Redis database without any keys, so every request is treated as new even though the same key is used. What am I doing wrong?

Solution

No effective solution for this problem has been collected yet. (One likely cause: a bare Substitute.For<IDatabase>() has no backing store, so HashIncrementAsync returns the default value 0 on every call unless it is explicitly stubbed with .Returns(...) to simulate an increasing counter.)


LeetCode 003. Longest Substring Without Repeating


Given a string, find the length of the longest substring without repeating characters. For example, the longest substring without repeating letters for "abcabcbb" is "abc", which the length is 3. For "bbbbb" the longest substring is "b", with the length of 1.

I misread the problem. It asks for the length of the longest substring that contains no repeating characters, but I read it as finding the longest repeated substring that contains no repeating characters. The two examples in the problem happen to fit my misreading as well.

Some code written under that misreading:

unordered_set<char> tmp;
bool is_unique(string& s)
{
    tmp.clear();
    for(int i=0; i!= s.size(); ++i)
    {
        if(tmp.find(s[i]) != tmp.end())
            return false;
        tmp.insert(s[i]);
    }
    return true;
}

class Solution {
public:
    int lengthOfLongestSubstring(string s) {
        unordered_set<string> allsubstrings;
        int maxlength = 1;
        int i, j;
        string str;
        for(i=0; i!= s.size(); ++i)
        {
            for(j =i+maxlength; j<= s.size(); ++j)
            {
                str = s.substr(i,j-i);
                if(is_unique(str))
                {
                    if(allsubstrings.find(str) != allsubstrings.end())
                    {
                        if(maxlength < str.size())
                            maxlength = str.size();
                    }
                    else
                    {
                        allsubstrings.insert(str);
                    }
                }
                else
                {
                    break;
                }
            }
        }
        return maxlength;
    }
};
Solution to the actual problem:

int lengthOfLongestSubstring(string s) {
  int n = s.length();
  int i = 0, j = 0;
  int maxLen = 0;
  bool exist[256] = { false };
  while (j < n) {
    if (exist[s[j]]) {
      maxLen = max(maxLen, j-i);
      while (s[i] != s[j]) {
        exist[s[i]] = false;
        i++;
      }
      i++;
      j++;
    } else {
      exist[s[j]] = true;
      j++;
    }
  }
  maxLen = max(maxLen, n-i);
  return maxLen;
}
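The same sliding-window idea can also be written with a map from each character to its last-seen index, jumping the window start directly instead of walking i forward one character at a time. This is a sketch in Java for comparison; the class name LongestUnique is made up:

```java
import java.util.HashMap;
import java.util.Map;

class LongestUnique {
    static int lengthOfLongestSubstring(String s) {
        Map<Character, Integer> last = new HashMap<>(); // last index of each char
        int maxLen = 0, start = 0;                      // window is [start, j]
        for (int j = 0; j < s.length(); j++) {
            char c = s.charAt(j);
            Integer prev = last.get(c);
            if (prev != null && prev >= start) {
                start = prev + 1; // jump past the previous occurrence of c
            }
            last.put(c, j);
            maxLen = Math.max(maxLen, j - start + 1);
        }
        return maxLen;
    }
}
```

On the problem's examples this returns 3 for "abcabcbb" and 1 for "bbbbb"; using a map instead of a bool[256] also handles arbitrary Unicode input.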



